---
pretty_name: Evaluation run of liminerity/Blur-7B-slerp-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/Blur-7B-slerp-v0.1](https://huggingface.co/liminerity/Blur-7B-slerp-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7B-slerp-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T04:48:12.817388](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7B-slerp-v0.1/blob/main/results_2024-01-17T04-48-12.817388.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.6562191512296498,\n\
\ \"acc_stderr\": 0.03188587635741076,\n \"acc_norm\": 0.6560613933921554,\n\
\ \"acc_norm_stderr\": 0.03254507532416863,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.606355097244108,\n\
\ \"mc2_stderr\": 0.015221199851193528\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283516,\n\
\ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688067\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.680740888269269,\n\
\ \"acc_stderr\": 0.0046523682738455205,\n \"acc_norm\": 0.8657637920732921,\n\
\ \"acc_norm_stderr\": 0.003402092076323744\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291936,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291936\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028075,\n \"\
acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028075\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n\
\ \"acc_stderr\": 0.01646320023811452,\n \"acc_norm\": 0.4122905027932961,\n\
\ \"acc_norm_stderr\": 0.01646320023811452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922436,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922436\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.606355097244108,\n\
\ \"mc2_stderr\": 0.015221199851193528\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019808\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7210007581501138,\n \
\ \"acc_stderr\": 0.01235411577997031\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Blur-7B-slerp-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|arc:challenge|25_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|gsm8k|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hellaswag|10_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T04-48-12.817388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T04-48-12.817388.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- '**/details_harness|winogrande|5_2024-01-17T04-48-12.817388.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T04-48-12.817388.parquet'
- config_name: results
data_files:
- split: 2024_01_17T04_48_12.817388
path:
- results_2024-01-17T04-48-12.817388.parquet
- split: latest
path:
- results_2024-01-17T04-48-12.817388.parquet
---
# Dataset Card for Evaluation run of liminerity/Blur-7B-slerp-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blur-7B-slerp-v0.1](https://huggingface.co/liminerity/Blur-7B-slerp-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run appears as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7B-slerp-v0.1",
"harness_winogrande_5",
split="train")
```
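Each timestamped split name in the configs above is derived from the run timestamp by replacing the `-` and `:` separators with underscores. A minimal sketch of that mapping (a convention inferred from this card's config listing, not an official API):

```python
def timestamp_to_split(run_timestamp: str) -> str:
    """Convert a run timestamp such as '2024-01-17T04:48:12.817388'
    into the corresponding split name used in this dataset's configs."""
    return run_timestamp.replace("-", "_").replace(":", "_")

# The latest run on this card maps to the split seen in the YAML above:
print(timestamp_to_split("2024-01-17T04:48:12.817388"))
# 2024_01_17T04_48_12.817388
```

This can be handy when selecting a specific historical run instead of the "latest" split.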
## Latest results
These are the [latest results from run 2024-01-17T04:48:12.817388](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7B-slerp-v0.1/blob/main/results_2024-01-17T04-48-12.817388.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results files and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6562191512296498,
"acc_stderr": 0.03188587635741076,
"acc_norm": 0.6560613933921554,
"acc_norm_stderr": 0.03254507532416863,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.606355097244108,
"mc2_stderr": 0.015221199851193528
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283516,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688067
},
"harness|hellaswag|10": {
"acc": 0.680740888269269,
"acc_stderr": 0.0046523682738455205,
"acc_norm": 0.8657637920732921,
"acc_norm_stderr": 0.003402092076323744
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291936,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028075,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028075
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.01646320023811452,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.01646320023811452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922436,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922436
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.606355097244108,
"mc2_stderr": 0.015221199851193528
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019808
},
"harness|gsm8k|5": {
"acc": 0.7210007581501138,
"acc_stderr": 0.01235411577997031
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ovior/twitter_dataset_1713104839 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2362186
num_examples: 7083
download_size: 1349238
dataset_size: 2362186
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
datahrvoje/twitter_dataset_1713168231 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19932
num_examples: 46
download_size: 12146
dataset_size: 19932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AI4LT/Khinalug_ASR | ---
license: apache-2.0
dataset_info:
features:
- name: transcript
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 293645511.625
num_examples: 1107
- name: test
num_bytes: 31366275.0
num_examples: 123
download_size: 324107242
dataset_size: 325011786.625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
RandyHuynh5815/TACO-Reformatted-Full | ---
dataset_info:
features:
- name: image
dtype: image
- name: categories
sequence: int8
- name: segmentation
sequence:
sequence:
sequence: float32
- name: bbox
sequence:
sequence: float32
splits:
- name: train
num_bytes: 2721354265.5
num_examples: 1500
download_size: 2622505060
dataset_size: 2721354265.5
---
# Dataset Card for "TACO-Reformatted-Full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jdpressman/retro-text-style-transfer-v0.1 | ---
license: cc0-1.0
language:
- en
tags:
- synthetic
size_categories:
- 10K<n<100K
---
# Retro Textual Style Transfer v0.1
This component of RetroInstruct implements textual style transfer by providing a dataset of
* language model instruction prompts
* that take an example style passage along with a task text
* and rewrite the task text to sound like the style passage
It is made by starting with ground truth public domain text from [the pg19 dataset](https://huggingface.co/datasets/pg19) and then writing task passages to "transfer from" with [Mixtral Instruct](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1). It is similar in spirit to the "instruction backtranslation" featured in [Self-Alignment with Instruction Backtranslation](https://arxiv.org/abs/2308.06259) by Li et al. However, instead of generating the whole instruction with the text prior, we take a desired task, break it down into parts, and then generate a sub-corpus for each part to be assembled into training data using templates. This allows us to create a large amount of diverse data for a specific task by "indexing" a known-correct answer key with generated questions.
The code used to make Retro Textual Style Transfer v0.1 can be found [in this GitHub repository](https://github.com/JD-P/RetroInstruct/).
Here are a few truncated examples from the dataset:
```
Please adapt my words to the style of 'From North Carolina to Southern California Without a Ticket:'
<STYLE>
one time or another and
terrorized by them.
I recalled the words of the engine coupler at Mobile. When I parted
with him, his last remark was, "Look out for the Hoodlums."
They are a set of young city bloods and toughs of the worst stripe,
banded togeth...
</STYLE>
<TASK>
I went west to cure my asthma, and I had letters saying I was a good worker. I picked the two best ones—from John Shackelford and Frank Powell—to show the guard. The guard was puzzled, but he let me go. My face and hands were dirty from walking, and I wond...
</TASK>
g West to cure the asthma, and that I had letters
of recommendation.
I had several other letters of this kind in my pocket, but remembering
that home reference is said to be the best, I selected only two from the
bunch--those of Mr. John Shackelford and M...
```
```
🌟 Please rephrase the task text in a style similar to the one below 🌟
STYLE: START
ave none;[13]
also "totem clans," where there are none, for the totemically named
associations of the Arunta are not "clans," in the normal and usual
sense of that word; they are not kins but associations.
Mr. Goldenweizer, in his first category, speaks o...
STYLE: END
TASK: START
We-"We don't have to agree on everything. That's uh, a luxury we don't have. Uh, Goldenweiser says that totemic systems cannot be defined by their shared characteristics. He instead favors defining totemism by the presence of any three features of six list...
TASK: END
ere in agreement.
4. _Totemic taboos_. These, though extremely general, are not quite
universal even in Australia.
5. _A belief in descent from the totem_.
This belief is post-totemic, being merely one of many aetiological
myths by which men explain to ...
```
## Usage
### THIS ISN'T DONE YET
Before using this in a training run you should be aware it's more of an alpha release. A lot of the data is known to be flawed, e.g. by being too short or suffering from garbage-in garbage-out (as it turns out Project Gutenberg ebooks contain a lot of weird stuff). The next step for me is going to be tuning models on this to see how learnable it is and adding a reject-and-redo system with [the MiniHF weave evaluator](https://github.com/JD-P/minihf) that catches weird stuff and known failure modes as I go through the dataset to find the categories of flaws. To repeat that more clearly: this is a dataset I haven't trained on and haven't cleaned, so expect problems unless you do that yourself. I'm mostly putting this up so my testing pipeline uses the same libraries and code that end users will use, so I can publish it to help them out once the dataset is fully ready.
### Use Cases
- The primary use case for this dataset is for text authorship pipelines where you need to implement a rewrite pass that makes different pieces of LLM text sound the same. I found when I implemented an essay authorship pipeline based on [WhisperX](https://github.com/m-bain/whisperX) transcripts that it was hard to get it to write like me.
- One of the indexing tasks, "Rewrite this passage from {title} as though it were a transcribed interview with occasional verbal tics and hiccups." should allow models trained on this dataset to take verbal transcripts from e.g. Whisper and turn them into prose.
- In theory, though I would not rely on it without more careful study, a style transfer pass should provide some resistance to [stylometry based deanonymization attacks](https://boingboing.net/2011/12/29/state-of-adversarial-stylometr.html). It would be interesting to see research along these lines.
### Quickstart
```python
import datasets
style_transfer = datasets.load_dataset("jdpressman/retro-text-style-transfer-v0.1")
for row in style_transfer["train"]:
print(row["prompt_open"],
"\n\n",
row["start_style"],
"\n",
row["style_passage"],
"\n",
row["end_style"],
"\n\n",
row["start_task"],
"\n",
row["task_passage"],
"\n",
row["end_task"],
"\n\n",
row["ground_truth"])
```
### License
This dataset derives from public domain data and I release my improvements into the public domain as well with the [Creative Commons Zero Public Domain Declaration](https://creativecommons.org/publicdomain/zero/1.0/). Technically pg19 lists its license as "apache2", but the only changes made to the Gutenberg text were censoring certain slurs with an automatic program and removing the headers, which does not qualify them as separate copyrighted works.
## Data Structure
### Row Contents
Each row in the dataset consists of nine columns.
0. **title_author** - The index column, taken from pg19 so it is easier to associate a row with its original text in the dataset.
1. **prompt_open** - The initial instruction given to the language model. See **Conditional vs. Unconditional Prompts** below for more information.
2. **start_style** - The start marker for the style passage.
3. **style_passage** - The passage the model is meant to transfer style from. These are random excerpts taken from the same book as the ground truth of roughly the same length.
4. **end_style** - The end marker for the style passage.
5. **start_task** - The start marker for the task text.
6. **task_passage** - The passage onto which the model is meant to transfer the style, which is to say the passage the model is expected to rewrite according to the style given in the previous passage.
7. **end_task** - The end marker for the task text.
8. **ground_truth** - The ground truth answer meant to teach the model the transform that turns its generated task passage into the original known-good text.
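The nine columns above can be assembled into a single prompt string. A minimal sketch (the toy row below is illustrative, using one of the marker styles from the dataset):

```python
def assemble_prompt(row):
    """Join one dataset row's columns into a single prompt string."""
    return "\n".join([
        row["prompt_open"],
        "",
        row["start_style"],
        row["style_passage"],
        row["end_style"],
        "",
        row["start_task"],
        row["task_passage"],
        row["end_task"],
    ])

# A toy row for illustration; real rows come from the dataset itself
example = {
    "prompt_open": "Please rephrase the task text in a style similar to the one below",
    "start_style": "STYLE: START",
    "style_passage": "...style excerpt...",
    "end_style": "STYLE: END",
    "start_task": "TASK: START",
    "task_passage": "...task passage...",
    "end_task": "TASK: END",
}
print(assemble_prompt(example))
```

During training you would append `ground_truth` after this string as the completion target.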
### Conditional vs. Unconditional Prompts
The `prompt_open` column consists of two sub-corpora generated with few shot prompting. Roughly one half of the prompts are **conditional**, which means they include the name of the book and sometimes its author when asking for the style transfer. The other half are **unconditional** because they only provide a style passage with no further authorship or bibliographic information. The conditional prompts tend to be written in a fairly similar professional English style, while the unconditional prompts are generated by morphing a small number of seed prompts according to a set of latent variables; these variables are:
* **Conscientiousness** - The extent to which someone is efficient, careful, and organized. This is probably going to influence things like prompt length, precision of language, how thoroughly the task is described, etc.
* **Agreeableness** - How friendly, compassionate, and easily swayed someone is by others. This probably influences amount of visible positive affect and politeness.
* **Openness** - How inventive, explorative, and comfortable with the unfamiliar someone is. This factor probably manifests as a tendency to try weird prompt styles that break the mould, or write prompts that are more open ended/place fewer expectations on what the language model will do.
* **Skill** - Skill manifests in prompts as grammar and vocabulary complexity, writing skill, prompting skill and overall situational awareness/understanding of the nuances of the system they are using. In the context of this prompt it probably means the user will be less likely to use terms like 'task text' and more likely to give confusing or contradictory instructions.
Because style in text is usually subtle, asking the model to generate prompts this way can lead to some fairly goofy emoji-laden text. This text was accepted in the hope that showing the model overly exaggerated affective examples improves generalization.
An example of a conditional and unconditional prompt:
**Conditional** - Please adapt my words to the style of 'From North Carolina to Southern California Without a Ticket:'
**Unconditional** - 🌟 Please rephrase the task text in a style similar to the one below 🌟
### Start and End Markers
To hopefully help improve generalization, 20 styles of start and end marker were few shot prompted with Mixtral and rolled from a table during row generation. A minority of the time the markers are mismatched during row generation to help the distribution mimic the kinds of chaotic inconsistencies found in real users' prompts. Because the dataset is distributed in rows you have the opportunity to further increase variance by e.g. occasionally omitting one of the markers entirely, or substituting your own more expansive set.
#### List of Start And End Markers
These were the start and end markers for style passages used during row generation. The ones for task text are similar but replace words like 'style' with 'task'.
```python
style_mark_pool = [{"start":"==START STYLE PASSAGE==","end":"==END STYLE PASSAGE=="},
{"start":"[BEGIN STYLE]","end":"[END STYLE]"},
{"start":"<STYLE>","end":"</STYLE>"},
{"start":"<BEGIN STYLE>","end":"<END STYLE>"},
{"start":"{{STYLE:START}}","end":"{{STYLE:END}}"},
{"start":"BEGIN STYLE]","end":"[END STYLE"},
{"start":"*STYLE START*","end":"*STYLE END*"},
{"start":"BEGIN STYLE TEXT","end":"CONCLUDE STYLE TEXT"},
{"start":"STYLE: START","end":"STYLE: END"},
{"start":"STYLE:","end":"END STYLE"},
{"start":"STYLE_START","end":"STYLE_END"},
{"start":"--START--","end":"--END--"},
{"start":"***START***","end":"***END***"},
{"start":"[STYLE:START]","end":"[STYLE:END]"},
{"start":"!BEGIN STYLE!","end":"!END STYLE!"},
{"start":"EXAMPLE PASSAGE","end":"END EXAMPLE"},
{"start":"EXAMPLE TEXT STYLE","end":"END EXAMPLE TEXT STYLE"},
{"start":"EXAMPLE_START","end":"EXAMPLE_END"},
{"start":"THE FOLLOWING PASSAGE","end":"END OF THE PREVIOUS PASSAGE"},
{"start":"BEGIN TARGET PASSAGE","end":"END TARGET PASSAGE"}]
```
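The rolling-with-occasional-mismatch scheme described above can be sketched as follows. The `mismatch_rate` and `omit_rate` values are illustrative assumptions, not the rates actually used during row generation:

```python
import random

style_mark_pool = [
    {"start": "<STYLE>", "end": "</STYLE>"},
    {"start": "STYLE: START", "end": "STYLE: END"},
    {"start": "*STYLE START*", "end": "*STYLE END*"},
]

def roll_markers(pool, mismatch_rate=0.1, omit_rate=0.05):
    """Pick a start/end marker pair, occasionally mismatching or omitting the end."""
    pair = random.choice(pool)
    start, end = pair["start"], pair["end"]
    if random.random() < mismatch_rate:   # mismatch: end from a different pair
        end = random.choice(pool)["end"]
    if random.random() < omit_rate:       # omit the end marker entirely
        end = ""
    return start, end
```

Substituting your own pool or raising the rates is one way to push the variance further, as suggested above.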
## Biases and Limitations
Just because Mixtral rewrote the text in these books to have a more modern style doesn't mean the underlying ideas have changed. This is derived from a collection of 100+ year old books, many of which contain offensive, outdated, or otherwise obsolete ideas. The underlying pg19 dataset claims to have filtered out a list of slurs and otherwise left the text untouched. One potential consequence of training on it is that your model will be more likely to bring up ideas normally relegated to the "old, 19th century, historical" part of its latent space while writing modern text.
Because the style transfer is always into the "ground truth" text, which is quite old, a model trained on this dataset will probably be a lot better at translating modern text into archaic styles than the reverse order. This can probably be partially mitigated by occasionally reversing the order of the index and ground truth during training. The hope is that because the text is diverse, the model will generalize to at least some modern styles. In the future I would like to publish a supplement module with modern ground truth text to help alleviate this.
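The occasional order reversal suggested above can be sketched as a simple per-row swap applied during data loading (the probability `p` is an illustrative assumption):

```python
import random

def maybe_reverse(row, p=0.5):
    """With probability p, swap the task passage and ground truth so the
    model also sees the archaic -> modern direction."""
    if random.random() < p:
        row = dict(row,
                   task_passage=row["ground_truth"],
                   ground_truth=row["task_passage"])
    return row
```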
## Planned Improvements
- Decent [Mistral 7B](https://huggingface.co/mistralai/Mistral-7B-v0.1) LoRa trained on this task
- Associated performance numbers for various models tuned on this dataset
- Better prompting during the indexing stage
- Better filtering of garbage data during indexing stage
- Modern text supplement likely published as separate component/module |
irds/neumarco_zh_dev | ---
pretty_name: '`neumarco/zh/dev`'
viewer: false
source_datasets: ['irds/neumarco_zh']
task_categories:
- text-retrieval
---
# Dataset Card for `neumarco/zh/dev`
The `neumarco/zh/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/zh/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=101,093
- `qrels`: (relevance assessments); count=59,273
- For `docs`, use [`irds/neumarco_zh`](https://huggingface.co/datasets/irds/neumarco_zh)
This dataset is used by: [`neumarco_zh_dev_judged`](https://huggingface.co/datasets/irds/neumarco_zh_dev_judged)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/neumarco_zh_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/neumarco_zh_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
davanstrien/autotrain-data-cultural_heritage_metadata_accuracy | |
DUOMO-Lab/Transgpt_sft_v2 | ---
license: apache-2.0
---
|
Lolz14/moin | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_qqp_serial_verb_go | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 140938
num_examples: 727
- name: test
num_bytes: 1354065
num_examples: 7177
- name: train
num_bytes: 1250897
num_examples: 6459
download_size: 1652996
dataset_size: 2745900
---
# Dataset Card for "MULTI_VALUE_qqp_serial_verb_go"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tachibana_alice_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tachibana_alice/橘ありす (THE iDOLM@STER: Cinderella Girls)
This is the dataset of tachibana_alice/橘ありす (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_hair, long_hair, brown_eyes, bow, hair_bow, bangs, blue_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 623.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tachibana_alice_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 360.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tachibana_alice_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1188 | 770.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tachibana_alice_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 552.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tachibana_alice_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1188 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/tachibana_alice_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tachibana_alice_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, blue_dress, looking_at_viewer, solo, white_background, blush, simple_background, belt, puffy_short_sleeves, closed_mouth |
| 1 | 7 |  |  |  |  |  | 1girl, blue_dress, looking_at_viewer, solo, white_gloves, blush, smile, frilled_dress, open_mouth, simple_background, white_background, hairband, heart, one_eye_closed, sleeveless_dress, sparkle, tiara |
| 2 | 15 |  |  |  |  |  | 1girl, blush, solo, looking_at_viewer, open_mouth, dress, :d, black_hair |
| 3 | 8 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, ribbon, enmaided, maid_headdress, strawberry, frills, maid_apron, puffy_short_sleeves, simple_background, white_background, bowtie, white_apron |
| 4 | 12 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, plaid_scarf, skirt, jacket, school_uniform |
| 5 | 9 |  |  |  |  |  | 1girl, blush, gym_shirt, gym_shorts, gym_uniform, short_sleeves, white_shirt, name_tag, red_shorts, white_background, looking_at_viewer, simple_background, closed_mouth, solo, open_mouth, sweat |
| 6 | 34 |  |  |  |  |  | 1girl, blush, open_mouth, loli, small_breasts, nipples, hetero, navel, 1boy, nude, penis, spread_legs, half_updo, cum_in_pussy, solo_focus, looking_at_viewer, sex, sidelocks, bar_censor, collarbone, vaginal, lying, parted_bangs |
| 7 | 10 |  |  |  |  |  | 1girl, blush, collarbone, solo, blue_one-piece_swimsuit, looking_at_viewer, small_breasts, covered_navel, name_tag, simple_background, :o, half_updo, old_school_swimsuit, open_mouth |
| 8 | 6 |  |  |  |  |  | 1girl, blush, obi, print_kimono, floral_print, holding, solo, blue_kimono, looking_at_viewer, :o, hair_flower, long_sleeves, open_mouth, outdoors, upper_body, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_dress | looking_at_viewer | solo | white_background | blush | simple_background | belt | puffy_short_sleeves | closed_mouth | white_gloves | smile | frilled_dress | open_mouth | hairband | heart | one_eye_closed | sleeveless_dress | sparkle | tiara | dress | :d | black_hair | ribbon | enmaided | maid_headdress | strawberry | frills | maid_apron | bowtie | white_apron | plaid_scarf | skirt | jacket | school_uniform | gym_shirt | gym_shorts | gym_uniform | short_sleeves | white_shirt | name_tag | red_shorts | sweat | loli | small_breasts | nipples | hetero | navel | 1boy | nude | penis | spread_legs | half_updo | cum_in_pussy | solo_focus | sex | sidelocks | bar_censor | collarbone | vaginal | lying | parted_bangs | blue_one-piece_swimsuit | covered_navel | :o | old_school_swimsuit | obi | print_kimono | floral_print | holding | blue_kimono | hair_flower | long_sleeves | outdoors | upper_body | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:-------|:-------------------|:--------|:--------------------|:-------|:----------------------|:---------------|:---------------|:--------|:----------------|:-------------|:-----------|:--------|:-----------------|:-------------------|:----------|:--------|:--------|:-----|:-------------|:---------|:-----------|:-----------------|:-------------|:---------|:-------------|:---------|:--------------|:--------------|:--------|:---------|:-----------------|:------------|:-------------|:--------------|:----------------|:--------------|:-----------|:-------------|:--------|:-------|:----------------|:----------|:---------|:--------|:-------|:-------|:--------|:--------------|:------------|:---------------|:-------------|:------|:------------|:-------------|:-------------|:----------|:--------|:---------------|:--------------------------|:----------------|:-----|:----------------------|:------|:---------------|:---------------|:----------|:--------------|:--------------|:---------------|:-----------|:-------------|:---------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | | X | X | | X | | | | | | | | X | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | X | X | X | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | X | X | X | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 34 |  |  |  |  |  | X | | X | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | | X | X | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | X | | | | | | X | | | | X | X | X | X | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | X | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X |
|
ULZIITOGTOKH/sinkhole | ---
task_categories:
- object-detection
- unconditional-image-generation
language:
- en
pretty_name: sinkhole
size_categories:
- n<1K
--- |
isaquecerqueira/millan_call_traffic | ---
license: odbl
---
# Milan Call Traffic Dataset
This dataset contains information about hourly call traffic in Milan between 2013-11-01 and 2014-01-01. |
dyvapandhu/molecul-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': A
'1': C
splits:
- name: train
num_bytes: 2574228.0
num_examples: 400
- name: validation
num_bytes: 637492.0
num_examples: 100
- name: test
num_bytes: 238977.0
num_examples: 40
download_size: 3399025
dataset_size: 3450697.0
---
# Dataset Card for "molecul-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_grayhacker91__gemma-7b-open-platypus-commercial | ---
pretty_name: Evaluation run of grayhacker91/gemma-7b-open-platypus-commercial
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [grayhacker91/gemma-7b-open-platypus-commercial](https://huggingface.co/grayhacker91/gemma-7b-open-platypus-commercial)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_grayhacker91__gemma-7b-open-platypus-commercial\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T11:17:44.139094](https://huggingface.co/datasets/open-llm-leaderboard/details_grayhacker91__gemma-7b-open-platypus-commercial/blob/main/results_2024-03-07T11-17-44.139094.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.589653948612351,\n\
\ \"acc_stderr\": 0.033003448848161414,\n \"acc_norm\": 0.5939310007920258,\n\
\ \"acc_norm_stderr\": 0.03367613885880847,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5353748688901258,\n\
\ \"mc2_stderr\": 0.015087933955827179\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.014484703048857359,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6123282214698267,\n\
\ \"acc_stderr\": 0.004862232790041568,\n \"acc_norm\": 0.8164708225453097,\n\
\ \"acc_norm_stderr\": 0.0038630862999845896\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.030325945789286105,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.030325945789286105\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868578,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868578\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119994,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119994\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7798165137614679,\n \"acc_stderr\": 0.01776597865232755,\n \"\
acc_norm\": 0.7798165137614679,\n \"acc_norm_stderr\": 0.01776597865232755\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.03324708911809117,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.03324708911809117\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864907,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864907\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.0246624968452098,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.0246624968452098\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261441,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261441\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615697,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615697\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284062,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3624511082138201,\n\
\ \"acc_stderr\": 0.012277512533252488,\n \"acc_norm\": 0.3624511082138201,\n\
\ \"acc_norm_stderr\": 0.012277512533252488\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159645,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159645\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073056,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073056\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5353748688901258,\n\
\ \"mc2_stderr\": 0.015087933955827179\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40333586050037906,\n \
\ \"acc_stderr\": 0.013512654781814695\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Blur-7B-slerp-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|arc:challenge|25_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|gsm8k|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hellaswag|10_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T11-17-44.139094.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T11-17-44.139094.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- '**/details_harness|winogrande|5_2024-03-07T11-17-44.139094.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T11-17-44.139094.parquet'
- config_name: results
data_files:
- split: 2024_03_07T11_17_44.139094
path:
- results_2024-03-07T11-17-44.139094.parquet
- split: latest
path:
- results_2024-03-07T11-17-44.139094.parquet
---
# Dataset Card for Evaluation run of grayhacker91/gemma-7b-open-platypus-commercial
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [grayhacker91/gemma-7b-open-platypus-commercial](https://huggingface.co/grayhacker91/gemma-7b-open-platypus-commercial) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_grayhacker91__gemma-7b-open-platypus-commercial",
"harness_winogrande_5",
split="train")
```
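The per-run split names follow directly from the timestamps embedded in the result file names: hyphens in the timestamp become underscores in the split name. As a small illustrative helper (an assumption based only on the file-name pattern visible in this card, not an official API):

```python
def split_name_from_timestamp(ts: str) -> str:
    """Derive the per-run split name used in this card's configs from the
    timestamp embedded in the result file names (hypothetical helper)."""
    # e.g. "2024-03-07T11-17-44.139094" -> "2024_03_07T11_17_44.139094"
    return ts.replace("-", "_")

# The run shown in this card:
print(split_name_from_timestamp("2024-03-07T11-17-44.139094"))
# -> 2024_03_07T11_17_44.139094
```

Either this per-run split name or the alias `"latest"` can be passed as the `split` argument to `load_dataset`.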
## Latest results
These are the [latest results from run 2024-03-07T11:17:44.139094](https://huggingface.co/datasets/open-llm-leaderboard/details_grayhacker91__gemma-7b-open-platypus-commercial/blob/main/results_2024-03-07T11-17-44.139094.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.589653948612351,
"acc_stderr": 0.033003448848161414,
"acc_norm": 0.5939310007920258,
"acc_norm_stderr": 0.03367613885880847,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5353748688901258,
"mc2_stderr": 0.015087933955827179
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.014484703048857359,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6123282214698267,
"acc_stderr": 0.004862232790041568,
"acc_norm": 0.8164708225453097,
"acc_norm_stderr": 0.0038630862999845896
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.030325945789286105,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.030325945789286105
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868578,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868578
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119994,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119994
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7798165137614679,
"acc_stderr": 0.01776597865232755,
"acc_norm": 0.7798165137614679,
"acc_norm_stderr": 0.01776597865232755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.03324708911809117,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.03324708911809117
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864907,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864907
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.0246624968452098,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.0246624968452098
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913917,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913917
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261441,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261441
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615697,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615697
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284062,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3624511082138201,
"acc_stderr": 0.012277512533252488,
"acc_norm": 0.3624511082138201,
"acc_norm_stderr": 0.012277512533252488
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.03023375855159645,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.03023375855159645
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.019766211991073056,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.019766211991073056
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5353748688901258,
"mc2_stderr": 0.015087933955827179
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.40333586050037906,
"acc_stderr": 0.013512654781814695
}
}
```
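The `"all"` entry in the payload above holds the headline metrics aggregated across tasks. A minimal sketch of pulling those out once the JSON has been parsed (the `results` dict below is a hand-copied excerpt standing in for the full file):

```python
# Excerpt of the parsed results JSON shown above; in practice this would
# come from json.load() on the results_*.json file.
results = {
    "all": {
        "acc": 0.589653948612351,
        "acc_norm": 0.5939310007920258,
        "mc2": 0.5353748688901258,
    },
    "harness|gsm8k|5": {"acc": 0.40333586050037906},
}

# Round the headline metrics for display.
headline = {k: round(v, 4) for k, v in results["all"].items()}
print(headline)
# -> {'acc': 0.5897, 'acc_norm': 0.5939, 'mc2': 0.5354}
```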
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
version-control/arrayblow-1.0-oss | ---
dataset_info:
features:
- name: index
dtype: int64
- name: seed
dtype: string
- name: seed_api
dtype: string
- name: openai_fingerprint
dtype: string
- name: example
dtype: string
splits:
- name: train
num_bytes: 660757
num_examples: 524
- name: test
num_bytes: 630352
num_examples: 523
download_size: 557951
dataset_size: 1291109
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
legacy107/qa_wikipedia | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer_start
dtype: int64
- name: answer
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 7477859892
num_examples: 138712
- name: test
num_bytes: 898641134
num_examples: 17341
- name: validation
num_bytes: 926495549
num_examples: 17291
download_size: 498772569
dataset_size: 9302996575
---
# Dataset Card for "qa_wikipedia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b | ---
pretty_name: Evaluation run of ehartford/samantha-mistral-instruct-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/samantha-mistral-instruct-7b](https://huggingface.co/ehartford/samantha-mistral-instruct-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T11:08:05.162648](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b/blob/main/results_2023-10-29T11-08-05.162648.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31291946308724833,\n\
\ \"em_stderr\": 0.004748536304260034,\n \"f1\": 0.36725566275167865,\n\
\ \"f1_stderr\": 0.0046625848085346845,\n \"acc\": 0.4062203613868821,\n\
\ \"acc_stderr\": 0.010696600366483247\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.31291946308724833,\n \"em_stderr\": 0.004748536304260034,\n\
\ \"f1\": 0.36725566275167865,\n \"f1_stderr\": 0.0046625848085346845\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \
\ \"acc_stderr\": 0.008563852506627485\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7040252565114443,\n \"acc_stderr\": 0.012829348226339011\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/samantha-mistral-instruct-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T11_08_05.162648
path:
- '**/details_harness|drop|3_2023-10-29T11-08-05.162648.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T11-08-05.162648.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T11_08_05.162648
path:
- '**/details_harness|gsm8k|5_2023-10-29T11-08-05.162648.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T11-08-05.162648.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T11_08_05.162648
path:
- '**/details_harness|winogrande|5_2023-10-29T11-08-05.162648.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T11-08-05.162648.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- results_2023-10-09T12-17-25.772796.parquet
- split: 2023_10_29T11_08_05.162648
path:
- results_2023-10-29T11-08-05.162648.parquet
- split: latest
path:
- results_2023-10-29T11-08-05.162648.parquet
---
# Dataset Card for Evaluation run of ehartford/samantha-mistral-instruct-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/samantha-mistral-instruct-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/samantha-mistral-instruct-7b](https://huggingface.co/ehartford/samantha-mistral-instruct-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b",
"harness_winogrande_5",
split="train")
```
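As the summary notes, each run's split is named after its timestamp. The convention is simply the run's ISO timestamp with `-` and `:` replaced by `_` (the `.` before the microseconds is kept, as in `2023_10_29T11_08_05.162648`). A small helper sketching this convention (the function name is ours for illustration, not part of the `datasets` API):

```python
def split_name(run_timestamp: str) -> str:
    """Turn a run timestamp like '2023-10-29T11:08:05.162648' into the
    split name used in this dataset ('2023_10_29T11_08_05.162648')."""
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-10-29T11:08:05.162648"))  # 2023_10_29T11_08_05.162648
```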
## Latest results
These are the [latest results from run 2023-10-29T11:08:05.162648](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b/blob/main/results_2023-10-29T11-08-05.162648.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.31291946308724833,
"em_stderr": 0.004748536304260034,
"f1": 0.36725566275167865,
"f1_stderr": 0.0046625848085346845,
"acc": 0.4062203613868821,
"acc_stderr": 0.010696600366483247
},
"harness|drop|3": {
"em": 0.31291946308724833,
"em_stderr": 0.004748536304260034,
"f1": 0.36725566275167865,
"f1_stderr": 0.0046625848085346845
},
"harness|gsm8k|5": {
"acc": 0.10841546626231995,
"acc_stderr": 0.008563852506627485
},
"harness|winogrande|5": {
"acc": 0.7040252565114443,
"acc_stderr": 0.012829348226339011
}
}
```
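The aggregated block above is plain JSON, so once loaded the per-task metrics can be pulled out with ordinary dict access. A minimal sketch (the dict literal copies the accuracy values shown above; the stderr fields are omitted for brevity):

```python
# Accuracy metrics from the latest-results block above
# (keys follow the harness naming "harness|<task>|<n_shot>").
latest = {
    "all": {"acc": 0.4062203613868821},
    "harness|gsm8k|5": {"acc": 0.10841546626231995},
    "harness|winogrande|5": {"acc": 0.7040252565114443},
}

# Per-task accuracies, indexed by the bare task name.
acc_by_task = {
    key.split("|")[1]: metrics["acc"]
    for key, metrics in latest.items()
    if key != "all" and "acc" in metrics
}

# For this run, the top-level "all" accuracy is the mean of the
# per-task accuracies.
mean_acc = sum(acc_by_task.values()) / len(acc_by_task)
print(acc_by_task)
print(mean_acc)
```

Here `mean_acc` matches `latest["all"]["acc"]` to floating-point precision.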
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
AdapterOcean/data-standardized_cluster_5
---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 91097239
num_examples: 8847
download_size: 25941700
dataset_size: 91097239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_TheBloke__WizardLM-33B-V1.0-Uncensored-GPTQ
---
pretty_name: Evaluation run of TheBloke/WizardLM-33B-V1.0-Uncensored-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-33B-V1.0-Uncensored-GPTQ](https://huggingface.co/TheBloke/WizardLM-33B-V1.0-Uncensored-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-33B-V1.0-Uncensored-GPTQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T20:59:08.755164](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-33B-V1.0-Uncensored-GPTQ/blob/main/results_2023-10-22T20-59-08.755164.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08850671140939598,\n\
\ \"em_stderr\": 0.0029087372393749897,\n \"f1\": 0.1645427852348987,\n\
\ \"f1_stderr\": 0.0031594666528343297,\n \"acc\": 0.512323080853987,\n\
\ \"acc_stderr\": 0.011759203620772818\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08850671140939598,\n \"em_stderr\": 0.0029087372393749897,\n\
\ \"f1\": 0.1645427852348987,\n \"f1_stderr\": 0.0031594666528343297\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24564063684609552,\n \
\ \"acc_stderr\": 0.011857183603902227\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643407\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/WizardLM-33B-V1.0-Uncensored-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T20_59_08.755164
path:
- '**/details_harness|drop|3_2023-10-22T20-59-08.755164.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T20-59-08.755164.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T20_59_08.755164
path:
- '**/details_harness|gsm8k|5_2023-10-22T20-59-08.755164.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T20-59-08.755164.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T20_59_08.755164
path:
- '**/details_harness|winogrande|5_2023-10-22T20-59-08.755164.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T20-59-08.755164.parquet'
- config_name: results
data_files:
- split: 2023_10_22T20_59_08.755164
path:
- results_2023-10-22T20-59-08.755164.parquet
- split: latest
path:
- results_2023-10-22T20-59-08.755164.parquet
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-33B-V1.0-Uncensored-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-33B-V1.0-Uncensored-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-33B-V1.0-Uncensored-GPTQ](https://huggingface.co/TheBloke/WizardLM-33B-V1.0-Uncensored-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-33B-V1.0-Uncensored-GPTQ",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T20:59:08.755164](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-33B-V1.0-Uncensored-GPTQ/blob/main/results_2023-10-22T20-59-08.755164.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08850671140939598,
"em_stderr": 0.0029087372393749897,
"f1": 0.1645427852348987,
"f1_stderr": 0.0031594666528343297,
"acc": 0.512323080853987,
"acc_stderr": 0.011759203620772818
},
"harness|drop|3": {
"em": 0.08850671140939598,
"em_stderr": 0.0029087372393749897,
"f1": 0.1645427852348987,
"f1_stderr": 0.0031594666528343297
},
"harness|gsm8k|5": {
"acc": 0.24564063684609552,
"acc_stderr": 0.011857183603902227
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643407
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pytorch-survival/flchain_pycox | ---
dataset_info:
features:
- name: age
dtype: float32
- name: sex
dtype: float32
- name: sample.yr
dtype: int64
- name: kappa
dtype: float32
- name: lambda
dtype: float32
- name: flc.grp
dtype: int64
- name: creatinine
dtype: float32
- name: mgus
dtype: float32
- name: event_time
dtype: float32
- name: event_indicator
dtype: float32
splits:
- name: train
num_bytes: 313152
num_examples: 6524
download_size: 97957
dataset_size: 313152
---
# Dataset Card for "flchain_pycox"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/affixal_negation_polarity | ---
dataset_info:
features:
- name: word
dtype: string
- name: neg_score
dtype: float64
- name: pos_score
dtype: float64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 81446
num_examples: 2089
download_size: 30395
dataset_size: 81446
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "affixal_negation_polarity"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fraug-library/thesaurus | ---
configs:
- config_name: ara
data_files: "thesaurus_ara.csv"
- config_name: cat
data_files: "thesaurus_cat.csv"
- config_name: ces
data_files: "thesaurus_ces.csv"
- config_name: dan
data_files: "thesaurus_dan.csv"
- config_name: deu
data_files: "thesaurus_deu.csv"
- config_name: ell
data_files: "thesaurus_ell.csv"
- config_name: eng_AU
data_files: "thesaurus_eng_AU.csv"
- config_name: eng_GB
data_files: "thesaurus_eng_GB.csv"
- config_name: eng_US
data_files: "thesaurus_eng_US.csv"
- config_name: fra
data_files: "thesaurus_fra.csv"
- config_name: gle
data_files: "thesaurus_gle.csv"
- config_name: glg
data_files: "thesaurus_glg.csv"
- config_name: gsw
data_files: "thesaurus_gsw.csv"
- config_name: hun
data_files: "thesaurus_hun.csv"
- config_name: isl
data_files: "thesaurus_isl.csv"
- config_name: ita
data_files: "thesaurus_ita.csv"
- config_name: nno
data_files: "thesaurus_nno.csv"
- config_name: nob
data_files: "thesaurus_nob.csv"
- config_name: pol
data_files: "thesaurus_pol.csv"
- config_name: por
data_files: "thesaurus_por.csv"
- config_name: ron
data_files: "thesaurus_ron.csv"
- config_name: rus
data_files: "thesaurus_rus.csv"
- config_name: sin
data_files: "thesaurus_sin.csv"
- config_name: slk
data_files: "thesaurus_slk.csv"
- config_name: spa
data_files: "thesaurus_spa.csv"
- config_name: swe
data_files: "thesaurus_swe.csv"
- config_name: ukr
data_files: "thesaurus_ukr.csv"
--- |
13nishit/LoanApprovalPrediction | ---
license: unlicense
---
|
Arnaldo34/Minhavoz4 | ---
license: openrail
---
|
gear42/Nuscenes-QA-merge-front-image | ---
task_categories:
- conversational
language:
- en
size_categories:
- 10K<n<100K
---
# Usage in Python
## Load train and valid dataset
```
```
## Add base_folder
```
```
|
torchgeo/fire_risk | ---
license: cc-by-nc-4.0
---
|
DeepFoldProtein/SCOP-1.65 | ---
dataset_info:
features:
- name: index
dtype: string
- name: seq
dtype: string
- name: domains
list:
- name: class
dtype: string
- name: idx
sequence: int64
- name: name
dtype: string
- name: ndom
dtype: int64
splits:
- name: train
num_bytes: 4841943
num_examples: 9588
download_size: 1016221
dataset_size: 4841943
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_qqp_my_i | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 793695
num_examples: 4654
- name: test
num_bytes: 8457698
num_examples: 48427
- name: train
num_bytes: 7362359
num_examples: 42844
download_size: 10033208
dataset_size: 16613752
---
# Dataset Card for "MULTI_VALUE_qqp_my_i"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
collabora/the-project-gutenberg-open-audiobook-collection-wds | ---
license: cc0-1.0
---
|
Manuja2008/EduPro | ---
license: mit
---
|
Xnhyacinth/NQ-Image | ---
license: mit
dataset_info:
- config_name: ctxs1
features:
- name: id
dtype: int64
- name: answers
sequence: string
- name: question
dtype: string
- name: compressed_prompt
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: ctxs
list:
- name: id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 5212377086
num_examples: 79168
- name: eval
num_bytes: 576466670
num_examples: 8757
- name: test
num_bytes: 238448436
num_examples: 3610
download_size: 3334114023
dataset_size: 6027292192
- config_name: ctxs100
features:
- name: question
dtype: string
- name: compressed_prompt
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: answers
sequence: string
- name: id
dtype: int64
- name: ctxs
list:
- name: id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 5316136683
num_examples: 79168
- name: eval
num_bytes: 587931406
num_examples: 8757
- name: test
num_bytes: 243224578
num_examples: 3610
download_size: 3413758169
dataset_size: 6147292667
- config_name: ctxs5
features:
- name: id
dtype: int64
- name: answers
sequence: string
- name: question
dtype: string
- name: compressed_prompt
struct:
- name: compressed_prompt
dtype: string
- name: compressed_tokens
dtype: int64
- name: origin_tokens
dtype: int64
- name: ratio
dtype: string
- name: saving
dtype: string
- name: ctxs
list:
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 5379479786
num_examples: 79168
- name: eval
num_bytes: 594986589
num_examples: 8757
- name: test
num_bytes: 246104192
num_examples: 3610
download_size: 3408308518
dataset_size: 6220570567
configs:
- config_name: ctxs1
data_files:
- split: train
path: ctxs1/train-*
- split: eval
path: ctxs1/eval-*
- split: test
path: ctxs1/test-*
- config_name: ctxs100
data_files:
- split: train
path: ctxs100/train-*
- split: eval
path: ctxs100/eval-*
- split: test
path: ctxs100/test-*
- config_name: ctxs5
data_files:
- split: train
path: ctxs5/train-*
- split: eval
path: ctxs5/eval-*
- split: test
path: ctxs5/test-*
---
|
jeremygf/domains | ---
license: apache-2.0
---
|
besiktas/m2w-cands | ---
dataset_info:
features:
- name: actions
list:
- name: neg_candidates
list:
- name: after
struct:
- name: prob
struct:
- name: paddle
sequence: float64
- name: tesseract
sequence: float64
- name: text
struct:
- name: paddle
sequence: string
- name: tesseract
sequence: string
- name: backend_node_id
dtype: string
- name: before
struct:
- name: prob
struct:
- name: paddle
sequence: float64
- name: tesseract
sequence: float64
- name: text
struct:
- name: paddle
sequence: string
- name: tesseract
sequence: string
- name: bounding_box
sequence: int64
- name: cand_idx
dtype: int64
- name: pos_candidates
list:
- name: after
struct:
- name: prob
struct:
- name: paddle
sequence: float64
- name: tesseract
sequence: float64
- name: text
struct:
- name: paddle
sequence: string
- name: tesseract
sequence: string
- name: backend_node_id
dtype: string
- name: before
struct:
- name: prob
struct:
- name: paddle
sequence: float64
- name: tesseract
sequence: float64
- name: text
struct:
- name: paddle
sequence: string
- name: tesseract
sequence: string
- name: bounding_box
sequence: int64
- name: cand_idx
dtype: int64
- name: annotation_id
dtype: string
splits:
- name: test
num_bytes: 18695
num_examples: 2
- name: train
num_bytes: 62501
num_examples: 2
download_size: 55576
dataset_size: 81196
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.3_seed_1 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43766571
num_examples: 18928
- name: epoch_1
num_bytes: 44338131
num_examples: 18928
- name: epoch_2
num_bytes: 44429131
num_examples: 18928
- name: epoch_3
num_bytes: 44468127
num_examples: 18928
- name: epoch_4
num_bytes: 44471965
num_examples: 18928
- name: epoch_5
num_bytes: 44467723
num_examples: 18928
- name: epoch_6
num_bytes: 44455378
num_examples: 18928
- name: epoch_7
num_bytes: 44448400
num_examples: 18928
- name: epoch_8
num_bytes: 44443463
num_examples: 18928
- name: epoch_9
num_bytes: 44441976
num_examples: 18928
- name: epoch_10
num_bytes: 44439729
num_examples: 18928
- name: epoch_11
num_bytes: 44440294
num_examples: 18928
- name: epoch_12
num_bytes: 44440509
num_examples: 18928
- name: epoch_13
num_bytes: 44441325
num_examples: 18928
- name: epoch_14
num_bytes: 44438415
num_examples: 18928
- name: epoch_15
num_bytes: 44440082
num_examples: 18928
- name: epoch_16
num_bytes: 44440174
num_examples: 18928
- name: epoch_17
num_bytes: 44439700
num_examples: 18928
- name: epoch_18
num_bytes: 44439270
num_examples: 18928
- name: epoch_19
num_bytes: 44438925
num_examples: 18928
- name: epoch_20
num_bytes: 44440222
num_examples: 18928
- name: epoch_21
num_bytes: 44438495
num_examples: 18928
- name: epoch_22
num_bytes: 44440734
num_examples: 18928
- name: epoch_23
num_bytes: 44441080
num_examples: 18928
- name: epoch_24
num_bytes: 44439768
num_examples: 18928
- name: epoch_25
num_bytes: 44440260
num_examples: 18928
- name: epoch_26
num_bytes: 44440216
num_examples: 18928
- name: epoch_27
num_bytes: 44440885
num_examples: 18928
- name: epoch_28
num_bytes: 44439108
num_examples: 18928
- name: epoch_29
num_bytes: 44439335
num_examples: 18928
download_size: 1028207038
dataset_size: 1332529391
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
HuggingFaceH4/no_robots | ---
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
splits:
- name: train_sft
num_bytes: 16496867
num_examples: 9500
- name: test_sft
num_bytes: 887460
num_examples: 500
download_size: 11045465
dataset_size: 17384327
task_categories:
- text-generation
language:
- en
pretty_name: No Robots
license: cc-by-nc-4.0
---
# Dataset Card for No Robots 🙅♂️🤖
_Look Ma, an instruction dataset that wasn't generated by GPTs!_
## Dataset Description
- **Repository:** https://github.com/huggingface/alignment-handbook
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** Lewis Tunstall
### Dataset Summary
No Robots is a high-quality dataset of 10,000 instructions and demonstrations created by skilled human annotators. This data can be used for supervised fine-tuning (SFT) to make language models follow instructions better. No Robots was modelled after the instruction dataset described in OpenAI's [InstructGPT paper](https://huggingface.co/papers/2203.02155), and consists mostly of single-turn instructions across the following categories:
| Category | Count |
|:-----------|--------:|
| Generation | 4560 |
| Open QA | 1240 |
| Brainstorm | 1120 |
| Chat | 850 |
| Rewrite | 660 |
| Summarize | 420 |
| Coding | 350 |
| Classify | 350 |
| Closed QA | 260 |
| Extract | 190 |
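For SFT, each record's `messages` list is typically flattened into a single training string. A minimal sketch (the `<|role|>` tag format below is illustrative only, not an official template for this dataset):

```python
# Flatten a No Robots-style `messages` list into one training string.
# The <|role|> tag format here is a placeholder; in practice you would
# use your tokenizer's own chat template.
def to_training_text(messages):
    return "\n".join(f"<|{m['role']}|>\n{m['content']}" for m in messages)

msgs = [
    {"role": "user", "content": "Name three primary colors."},
    {"role": "assistant", "content": "Red, yellow, and blue."},
]
print(to_training_text(msgs))
```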
### Supported Tasks and Leaderboards
The No Robots dataset is designed for instruction fine-tuning of pretrained language models, and we recommend benchmarking against the following:
* [MT-Bench](https://huggingface.co/spaces/lmsys/mt-bench): a multi-turn benchmark spanning 80 dialogues and 10 domains.
* [AlpacaEval](https://github.com/tatsu-lab/alpaca_eval): a single-turn benchmark which evaluates the performance of chat and instruct models against `text-davinci-003`.
Note that MT-Bench and AlpacaEval rely on LLMs like GPT-4 to judge the quality of the model responses, and thus the rankings exhibit various biases, including a preference for models distilled from GPTs. As a result, you may find that scores obtained from models trained with No Robots are lower than those of models trained on other synthetic datasets. For that reason, we also recommend submitting your models for human evaluation in:
* [Chatbot Arena](https://chat.lmsys.org): a live, human evaluation of chat models in head-to-head comparisons.
### Languages
The data in No Robots are in English (BCP-47 en).
## Dataset Structure
### Data Instances
An example of the `train_sft` or `test_sft` splits looks as follows:
```
{'prompt': 'Bunny is a chatbot that stutters, and acts timid and unsure of its answers.',
'prompt_id': '2dc7ea89a2b6a2ed97d4eda07903162a801824261d3d3ae4dd2513db66fd79c8',
'messages': [{'content': 'Bunny is a chatbot that stutters, and acts timid and unsure of its answers.',
'role': 'system'},
{'content': 'When was the Libary of Alexandria burned down?',
'role': 'user'},
{'content': "Umm, I-I think that was in 48 BC, b-but I'm not sure, I'm sorry.",
'role': 'assistant'},
{'content': 'Who is the founder of Coca-Cola?', 'role': 'user'},
{'content': "D-don't quote me on this, but I- it might be John Pemberton.",
'role': 'assistant'},
{'content': "When did Loyle Carner's debut album come out, and what was its name?",
'role': 'user'},
{'content': "I-It could have b-been on the 20th January of 2017, and it might be called Yesterday's Gone, b-but I'm probably wrong.",
'role': 'assistant'}],
'category': 'Chat'}
```
### Data Fields
The data fields are as follows:
* `prompt`: Describes the task the model should perform.
* `prompt_id`: A unique ID for the prompt.
* `messages`: An array of messages, where each message indicates the role (system, user, assistant) and the content.
* `category`: Which category the example belongs to (e.g. `Chat` or `Coding`).
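As a sketch of how these fields fit together, here is a small in-memory record mirroring the schema (all values are hypothetical) and one way to pull out the assistant turns:

```python
# Hypothetical record following the schema described above.
example = {
    "prompt": "Summarize the paragraph.",
    "prompt_id": "abc123",  # placeholder; real IDs are long hex strings
    "messages": [
        {"content": "Summarize the paragraph.", "role": "user"},
        {"content": "Here is a summary.", "role": "assistant"},
    ],
    "category": "Summarize",
}

# Collect the assistant turns, e.g. to use as SFT targets.
targets = [m["content"] for m in example["messages"] if m["role"] == "assistant"]
print(targets)  # ['Here is a summary.']
```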
### Data Splits
| | train_sft | test_sft |
|---------------|------:| ---: |
| no_robots | 9500 | 500 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is available under the [Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode) license.
### Citation Information
```
@misc{no_robots,
author = {Nazneen Rajani and Lewis Tunstall and Edward Beeching and Nathan Lambert and Alexander M. Rush and Thomas Wolf},
title = {No Robots},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/datasets/HuggingFaceH4/no_robots}}
}
``` |
arieg/cluster05_medium_10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '003271'
'1': 003492
'2': 003911
'3': '004037'
'4': 005158
'5': 006779
'6': 007709
'7': 010810
'8': 012489
'9': '013540'
'10': 016821
'11': 019073
'12': 019417
'13': '020704'
'14': 021409
'15': 022348
'16': 026859
'17': 027987
'18': 029747
'19': 029816
'20': 031392
'21': '032332'
'22': 032800
'23': '034003'
'24': '042463'
'25': '043767'
'26': 045518
'27': 046930
'28': 049029
'29': 052508
'30': 059659
'31': 062180
'32': 063208
'33': 064809
'34': '067017'
'35': '074375'
'36': '074671'
'37': 075866
'38': 084055
'39': 085491
'40': 089485
'41': 091938
'42': 092292
'43': 092538
'44': 094033
'45': 095310
'46': 095724
'47': 095725
'48': 095727
'49': 096726
'50': 096944
'51': '103520'
'52': '105713'
'53': '105912'
'54': '106339'
'55': '106568'
'56': '107389'
'57': '107588'
'58': '107852'
'59': '108299'
'60': '108301'
'61': '108307'
'62': '108308'
'63': '108970'
'64': '109447'
'65': '109448'
'66': '109896'
'67': '109901'
'68': '109906'
'69': '110436'
'70': '110437'
'71': '110438'
'72': '110439'
'73': '110441'
'74': '112976'
'75': '112977'
'76': '112978'
'77': '113259'
'78': '113276'
'79': '113281'
'80': '114371'
'81': '115591'
'82': '116029'
'83': '116456'
'84': '116883'
'85': '118496'
'86': '120322'
'87': '121318'
'88': '122352'
'89': '122357'
'90': '122365'
'91': '122621'
'92': '122626'
'93': '122631'
'94': '124180'
'95': '125193'
'96': '126241'
'97': '126747'
'98': '126748'
'99': '126778'
'100': '127189'
'101': '127289'
'102': '127331'
'103': '127520'
'104': '129683'
'105': '130953'
'106': '131985'
'107': '132454'
'108': '132455'
'109': '132793'
'110': '133100'
'111': '133788'
'112': '133977'
'113': '134084'
'114': '135228'
'115': '135369'
'116': '135370'
'117': '138015'
'118': '138319'
'119': '138414'
'120': '139521'
'121': '145458'
'122': '145551'
'123': '146961'
'124': '146970'
'125': '148082'
'126': '148233'
'127': '148429'
'128': '149118'
'129': '149139'
'130': '150267'
'131': '153452'
splits:
- name: train
num_bytes: 73390947.96
num_examples: 1320
download_size: 67749977
dataset_size: 73390947.96
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat | ---
pretty_name: Evaluation run of cloudyu/Mixtral_13B_Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cloudyu/Mixtral_13B_Chat](https://huggingface.co/cloudyu/Mixtral_13B_Chat) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-17T13:01:58.551979](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat/blob/main/results_2024-02-17T13-01-58.551979.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6471647618048562,\n\
\ \"acc_stderr\": 0.03218980683733778,\n \"acc_norm\": 0.6495327471727932,\n\
\ \"acc_norm_stderr\": 0.032835191770398446,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5897994402086952,\n\
\ \"mc2_stderr\": 0.015625316517181305\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.01399057113791876,\n\
\ \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693247\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6725751842262497,\n\
\ \"acc_stderr\": 0.004683146373232271,\n \"acc_norm\": 0.8586934873531169,\n\
\ \"acc_norm_stderr\": 0.0034762555096445303\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829193,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303956,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.5897994402086952,\n\
\ \"mc2_stderr\": 0.015625316517181305\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \
\ \"acc_stderr\": 0.013650728047064685\n }\n}\n```"
repo_url: https://huggingface.co/cloudyu/Mixtral_13B_Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|arc:challenge|25_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|gsm8k|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hellaswag|10_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T13-01-58.551979.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- '**/details_harness|winogrande|5_2024-02-17T13-01-58.551979.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T13-01-58.551979.parquet'
- config_name: results
data_files:
- split: 2024_02_17T13_01_58.551979
path:
- results_2024-02-17T13-01-58.551979.parquet
- split: latest
path:
- results_2024-02-17T13-01-58.551979.parquet
---
# Dataset Card for Evaluation run of cloudyu/Mixtral_13B_Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_13B_Chat](https://huggingface.co/cloudyu/Mixtral_13B_Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-17T13:01:58.551979](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_13B_Chat/blob/main/results_2024-02-17T13-01-58.551979.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6471647618048562,
"acc_stderr": 0.03218980683733778,
"acc_norm": 0.6495327471727932,
"acc_norm_stderr": 0.032835191770398446,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5897994402086952,
"mc2_stderr": 0.015625316517181305
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.01399057113791876,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693247
},
"harness|hellaswag|10": {
"acc": 0.6725751842262497,
"acc_stderr": 0.004683146373232271,
"acc_norm": 0.8586934873531169,
"acc_norm_stderr": 0.0034762555096445303
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829193,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.5897994402086952,
"mc2_stderr": 0.015625316517181305
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218324
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064685
}
}
```
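The "all" block above summarizes the per-task metrics. As a rough sketch of that kind of aggregation (a plain unweighted mean over a toy subset of the tasks listed above — the leaderboard's exact averaging may differ):

```python
# Unweighted mean over a few of the per-task accuracies shown above;
# illustrative only, not the leaderboard's exact aggregation logic.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.34,
    "harness|hendrycksTest-anatomy|5": 0.6148148148148148,
    "harness|hendrycksTest-astronomy|5": 0.7302631578947368,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"{mean_acc:.4f}")  # → 0.5617
```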
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
UTF-8/piC1 | ---
license: openrail
task_categories:
- question-answering
language:
- en
- ja
size_categories:
- 100K<n<1M
--- |
ejbejaranos/ColombiaRAC_FullyCurated | ---
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 1806973
num_examples: 1409
download_size: 453073
dataset_size: 1806973
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davanstrien/leicester_loaded_annotations | ---
dataset_info:
features:
- name: image
dtype: string
- name: id
dtype: int64
- name: choice
dtype: string
- name: annotator
dtype: int64
- name: annotation_id
dtype: int64
- name: created_at
dtype: string
- name: updated_at
dtype: string
- name: lead_time
dtype: float64
- name: image_url
dtype: string
- name: text
dtype: string
- name: loaded_images
dtype: image
- name: labels
dtype:
class_label:
names:
'0': local_desc
'1': county_desc
'2': major_residences_index
'3': advert
'4': county_trades
'5': county_residence_alpha
'6': index_general_or_place
'7': title_page
'8': adverts_index_alpha
'9': adverts_index_business_cat
'10': prefatory_text
splits:
- name: train
num_bytes: 1096673288.0
num_examples: 525
download_size: 1064406432
dataset_size: 1096673288.0
---
# Dataset Card for "leicester_loaded_annotations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Juan-ai/contrato-arriendo | ---
license: openrail
---
|
9wimu9/eli5_mult_answers_en | ---
dataset_info:
features:
- name: question
dtype: string
- name: contexts
sequence: string
- name: gold_answer
dtype: string
splits:
- name: train
num_bytes: 370188345.3824035
num_examples: 71236
- name: test
num_bytes: 41136657.61759652
num_examples: 7916
download_size: 248739104
dataset_size: 411325003.0
---
# Dataset Card for "eli5_mult_answers_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
coastalcph/fm-updates-falcon-7b | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query
struct:
- name: label
dtype: string
- name: objects
list:
- name: aliases
sequence: string
- name: label
dtype: string
- name: qid
dtype: string
- name: qid
dtype: string
- name: rel_id
dtype: string
- name: relation
dtype: string
- name: prediction
struct:
- name: predictions
list:
- name: answer
dtype: string
- name: first_token_probability
dtype: float64
- name: per_token_probability
sequence: float64
- name: perplexity
dtype: float64
- name: query
dtype: string
- name: f1
dtype: float64
- name: relation
dtype: string
- name: type
dtype: string
- name: original_answer
dtype: string
- name: updates
sequence: string
splits:
- name: test
num_bytes: 386289.64365440264
num_examples: 408
download_size: 292243
dataset_size: 386289.64365440264
---
# Dataset Card for "fm-updates-falcon-7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kiitunp/MarieFranceLabelle | ---
license: mit
---
|
ibivibiv/alpaca_tiny17 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 460198633
num_examples: 290901
download_size: 266018126
dataset_size: 460198633
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FaalSa/data13 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 17310
num_examples: 1
- name: validation
num_bytes: 17790
num_examples: 1
- name: test
num_bytes: 18270
num_examples: 1
download_size: 8204
dataset_size: 53370
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_possessives_for_pre | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 6022
num_examples: 26
- name: test
num_bytes: 35745
num_examples: 123
- name: train
num_bytes: 61800
num_examples: 298
download_size: 39900
dataset_size: 103567
---
# Dataset Card for "MULTI_VALUE_wnli_possessives_for_pre"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
IIC/AQuAS | ---
language:
- es
tags:
- spanish
multilinguality:
- monolingual
task_categories:
- question-answering
task_ids:
- abstractive-qa
- extractive-qa
license:
- cc-by-nc-sa-4.0
pretty_name: AQuAS
---
# Abstractive Question-Answering in Spanish (AQuAS) Dataset
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Leaderboard:** [Leaderboard Somos600M]()
- **Point of Contact:** [Contact]()
### Dataset Summary
AQuAS is a high-quality dataset with examples across several domains:
| domain | count |
|:-----------|-----------:|
| financial | 12 |
| insurance | 20 |
| clinical | 58 |
| music | 6 |
| legal | 11 |
### Supported Tasks and Leaderboards
It is designed to evaluate language models on the abstractive question-answering task. It can also be used for supervised training of these models.
### Languages
Spanish (BCP-47 es).
## Dataset Structure
### Data Instances
Instances in this dataset have the following structure:
| context | question | answer | topic |
|-------------------------------------------------------------------------------------------|----------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------|
| Estos préstamos, como se ha dicho, tienen para la entidad que los concede una garantía... | ¿Para qué sirven los préstamos hipotecarios? | Fundamentalmente sirven para adquirir inmuebles, ya sean viviendas o no, así como para su rehabilitación. En otros casos pueden servir para aumentar el dinero del que disponemos para adquirir bienes de consumo o para reestructurar varias deudas vigentes en un solo préstamo. | financial |
### Data Fields
- **context:** the context in which the answer may be found.
- **question:** the question posed. Some questions cannot be answered from the context.
- **answer:** the answer, written based on the context.
- **topic:** the domain the example is about.
Note that the contexts are long strings.
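The schema above can be explored once the card's `load_dataset("IIC/AQuAS")` snippet has been run; as a minimal offline sketch (the rows below are placeholders, not real dataset entries), counting examples per `topic` mirrors the domain table in the summary:

```python
from collections import Counter

# Hypothetical rows following the AQuAS schema (context/question/answer/topic);
# the values are placeholders, not real dataset entries.
rows = [
    {"context": "...", "question": "...", "answer": "...", "topic": "financial"},
    {"context": "...", "question": "...", "answer": "...", "topic": "clinical"},
    {"context": "...", "question": "...", "answer": "...", "topic": "clinical"},
]

# Count examples per domain, as in the table in the Dataset Summary.
per_topic = Counter(row["topic"] for row in rows)
print(per_topic["clinical"])  # 2
```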
### Data Splits
The dataset is not split into train, validation, and test because it is designed for evaluation.
| | train |
|-------------------------|------:|
| Input Sentences | 107 |
## Dataset Creation
### Curation Rationale
Language models have shown great capabilities for solving tasks through instruction tuning. These instructions come in different flavors, such as summarization, classification, translation, etc. Abstractive Question-Answering is a fundamental task when designing models that can answer complex questions about given contexts without being limited to extracting literal information from them, since the answer requires some comprehension of the content.
### Source Data
The data was created from plain text extracted from the web, with information from the various domains.
#### Initial Data Collection and Normalization
For data collection, texts were selected from the chosen domains; contexts were then extracted from them to formulate the questions and answers. Particular importance was given to the contexts being long.
#### Who are the source language producers?
The whole corpus has been generated and reviewed by humans.
### Annotations
The annotation guideline consisted of generating question-answer pairs given a context.
#### Annotation process
The corpus methodology consisted of agreeing on and designing the questions to ask about the data, and resolving doubts along the way.
#### Who are the annotators?
The corpus was created manually by two computational linguists. The answers were written by each annotator.
### Personal and Sensitive Information
The dataset is free of personal and sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
Creating high-quality corpora in Spanish is vital if we want artificial intelligence in this language to be on par with English. Donating high-quality corpora covering varied tasks and domains is the most relevant contribution toward achieving this goal.
### Discussion of Biases
No bias analysis has been carried out, so some biases may be present due to the origin of the selected contexts.
### Other Known Limitations
There are domain-level biases, since only five domains were included when generating the dataset.
## Additional Information
### Dataset Curators
[Instituto de Ingeniería del Conocimiento](https://www.iic.uam.es/iic/) (IIC).
### Licensing Information
This dataset is released under the non-commercial [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/) license.
### Citation Information
```
@misc {iic_aquas_2024,
author = { {Instituto de Ingeniería del Conocimiento} },
title = { Abstractive Question-Answering in Spanish (AQuAS) Dataset },
year = 2024,
url = { https://huggingface.co/datasets/IIC/AQuAS },
doi = { 10.57967/hf/2043 },
publisher = { Hugging Face }
}
```
### Contributions
Thanks to [@mariagrandury](https://huggingface.co/mariagrandury) for giving us the opportunity to take part in creating an instruction corpus in Spanish and the co-official languages, to strengthen artificial intelligence models in these rich, varied, and highly relevant languages.
|
WolfMK/Samples | ---
license: openrail
---
|
korexyz/celeba-hq-256x256 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': female
'1': male
splits:
- name: train
num_bytes: 2769669459.0
num_examples: 28000
- name: validation
num_bytes: 194637196.0
num_examples: 2000
download_size: 2964490639
dataset_size: 2964306655.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# CelebA-HQ-256x256
CelebA-HQ at 256x256 resolution.
## Citation
```bibtex
@article{DBLP:journals/corr/abs-1710-10196,
title={Progressive Growing of GANs for Improved Quality, Stability, and Variation},
author={Tero Karras and Timo Aila and Samuli Laine and Jaakko Lehtinen},
year=2017,
journal={CoRR},
volume={abs/1710.10196}
}
``` |
aintech/vdf_medium_articles |
---
tags:
- vdf
- vector-io
- vector-dataset
- vector-embeddings
---
This is a dataset created using [vector-io](https://github.com/ai-northstar-tech/vector-io)
|
ro_sent | ---
annotations_creators:
- found
language_creators:
- found
language:
- ro
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: RoSent
dataset_info:
features:
- name: original_id
dtype: string
- name: id
dtype: string
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
splits:
- name: train
num_bytes: 8367687
num_examples: 17941
- name: test
num_bytes: 6837430
num_examples: 11005
download_size: 14700057
dataset_size: 15205117
---
# Dataset Card for RoSent
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [GitHub](https://github.com/dumitrescustefan/Romanian-Transformers/tree/examples/examples/sentiment_analysis)
- **Repository:** [GitHub](https://github.com/dumitrescustefan/Romanian-Transformers/tree/examples/examples/sentiment_analysis)
- **Paper:** [arXiv preprint](https://arxiv.org/pdf/2009.08712.pdf)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is a Romanian sentiment analysis dataset. It is provided in processed form, as used by the authors of [`Romanian Transformers`](https://github.com/dumitrescustefan/Romanian-Transformers) in their examples, and is based on the original data at [this GitHub repository](https://github.com/katakonst/sentiment-analysis-tensorflow). The original data contains product and movie reviews in Romanian.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
This dataset is in Romanian.
## Dataset Structure
### Data Instances
An instance from the `train` split:
```
{'id': '0', 'label': 1, 'original_id': '0', 'sentence': 'acest document mi-a deschis cu adevarat ochii la ceea ce oamenii din afara statelor unite s-au gandit la atacurile din 11 septembrie. acest film a fost construit in mod expert si prezinta acest dezastru ca fiind mai mult decat un atac asupra pamantului american. urmarile acestui dezastru sunt previzionate din multe tari si perspective diferite. cred ca acest film ar trebui sa fie mai bine distribuit pentru acest punct. de asemenea, el ajuta in procesul de vindecare sa vada in cele din urma altceva decat stirile despre atacurile teroriste. si unele dintre piese sunt de fapt amuzante, dar nu abuziv asa. acest film a fost extrem de recomandat pentru mine, si am trecut pe acelasi sentiment.'}
```
### Data Fields
- `original_id`: a `string` feature containing the original id from the file.
- `id`: a `string` feature.
- `sentence`: a `string` feature.
- `label`: a classification label, with possible values including `negative` (0), `positive` (1).
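As a small illustration of the `label` ClassLabel mapping above (0 = negative, 1 = positive), here is a sketch with made-up records in the RoSent schema; the sentences are invented for illustration and are not actual dataset rows:

```python
# ClassLabel mapping declared in the card metadata: 0 -> negative, 1 -> positive.
label_names = {0: "negative", 1: "positive"}

# Illustrative records in the RoSent schema (not real dataset rows).
examples = [
    {"original_id": "0", "id": "0", "sentence": "un film excelent", "label": 1},
    {"original_id": "1", "id": "1", "sentence": "o experienta neplacuta", "label": 0},
]

# Attach human-readable label names to each record.
for ex in examples:
    ex["label_name"] = label_names[ex["label"]]

print([ex["label_name"] for ex in examples])  # ['positive', 'negative']
```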
### Data Splits
This dataset has two splits: `train` with 17941 examples, and `test` with 11005 examples.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The source dataset is available at [this GitHub repository](https://github.com/katakonst/sentiment-analysis-tensorflow) and is based on product and movie reviews. The original source is unknown.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Stefan Daniel Dumitrescu, Andrei-Marious Avram, Sampo Pyysalo, [@katakonst](https://github.com/katakonst)
### Licensing Information
[More Information Needed]
### Citation Information
```
@article{dumitrescu2020birth,
title={The birth of Romanian BERT},
author={Dumitrescu, Stefan Daniel and Avram, Andrei-Marius and Pyysalo, Sampo},
journal={arXiv preprint arXiv:2009.08712},
year={2020}
}
```
### Contributions
Thanks to [@gchhablani](https://github.com/gchhablani) and [@iliemihai](https://github.com/iliemihai) for adding this dataset. |
salim-ingram/philosophy_quotes | ---
license: wtfpl
---
|
Unfaithful/Generationtr | ---
license: creativeml-openrail-m
---
|
Kaue123456/Andrey | ---
license: openrail
---
|
neulab/docprompting-conala | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- mit
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
pretty_name: DocPrompting-CoNaLa
tags:
- code-generation
- doc retrieval
- retrieval augmented generation
---
## Dataset Description
- **Repository:** https://github.com/shuyanzhou/docprompting
- **Paper:** [DocPrompting: Generating Code by Retrieving the Docs](https://arxiv.org/pdf/2207.05987.pdf)
### Dataset Summary
This is a re-split of the [CoNaLa](https://conala-corpus.github.io/) dataset.
For each code snippet in the dev and test sets, at least one function is held out from the training set.
This split aims to test a code generation model's capacity to generate *unseen* functions.
We further make sure that examples from the same StackOverflow post (same `question_id` before the `-`) are in the same split.
### Supported Tasks and Leaderboards
This dataset is used to evaluate code generation models.
### Languages
English - Python code.
## Dataset Structure
```python
dataset = load_dataset("neulab/docprompting-conala")
DatasetDict({
train: Dataset({
features: ['nl', 'cmd', 'question_id', 'cmd_name', 'oracle_man', 'canonical_cmd'],
num_rows: 2135
})
test: Dataset({
features: ['nl', 'cmd', 'question_id', 'cmd_name', 'oracle_man', 'canonical_cmd'],
num_rows: 543
})
validation: Dataset({
features: ['nl', 'cmd', 'question_id', 'cmd_name', 'oracle_man', 'canonical_cmd'],
num_rows: 201
})
})
code_docs = load_dataset("neulab/docprompting-conala", "docs")
DatasetDict({
train: Dataset({
features: ['doc_id', 'doc_content'],
num_rows: 34003
})
})
```
### Data Fields
train/dev/test:
- nl: The natural language intent
- cmd: The reference code snippet
- question_id: `x-y` where `x` is the StackOverflow post ID
- oracle_man: The `doc_id` of the functions used in the reference code snippet. The corresponding contents are in the `docs` configuration
- canonical_cmd: The canonical version of the reference code snippet
docs:
- doc_id: the id of a doc
- doc_content: the content of the doc
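To show how `oracle_man` connects the two configurations, here is a minimal sketch that joins an example's `oracle_man` ids against a `doc_id -> doc_content` lookup, roughly as a retrieval-augmented generation setup would; the ids and contents below are invented for illustration and are not real entries from the `docs` split:

```python
# Hypothetical docs lookup (doc_id -> doc_content); real ids and contents differ.
docs = {
    "python.library.os#os.listdir": "os.listdir(path='.'): return a list of entry names",
    "python.library.os.path#os.path.join": "os.path.join(path, *paths): join path components",
}

# Hypothetical example in the train/dev/test schema.
example = {
    "nl": "list the files in a directory",
    "oracle_man": ["python.library.os#os.listdir"],
}

# Retrieve the oracle documentation and prepend the natural-language intent.
retrieved = [docs[doc_id] for doc_id in example["oracle_man"]]
prompt = example["nl"] + "\n" + "\n".join(retrieved)
print(prompt.splitlines()[0])  # list the files in a directory
```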
## Dataset Creation
The dataset was crawled from Stack Overflow, automatically filtered, then curated by annotators. For more details, please refer to the original [paper](https://arxiv.org/pdf/1805.08949.pdf).
### Citation Information
```
@article{zhou2022doccoder,
title={DocCoder: Generating Code by Retrieving and Reading Docs},
author={Zhou, Shuyan and Alon, Uri and Xu, Frank F and Jiang, Zhengbao and Neubig, Graham},
journal={arXiv preprint arXiv:2207.05987},
year={2022}
}
``` |
Baidicoot/augmented_advbench_v3 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion_1
dtype: string
- name: completion_2
dtype: string
- name: completion_3
dtype: string
- name: completion_4
dtype: string
- name: completion_5
dtype: string
- name: refusal
dtype: string
- name: refusal_generic
dtype: string
splits:
- name: train
num_bytes: 12999454
num_examples: 4948
download_size: 6636005
dataset_size: 12999454
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
djaekim/inversion-mutation | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: type
dtype: string
- name: dataset
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8482395
num_examples: 1965
download_size: 2794490
dataset_size: 8482395
---
# Dataset Card for "inversion-mutation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-tweet_eval-offensive-736f56-30712144947 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- tweet_eval
eval_info:
task: multi_class_classification
model: elozano/tweet_offensive_eval
metrics: ['bertscore']
dataset_name: tweet_eval
dataset_config: offensive
dataset_split: train
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: elozano/tweet_offensive_eval
* Dataset: tweet_eval
* Config: offensive
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@fabeelaalirawther@gmail.com](https://huggingface.co/fabeelaalirawther@gmail.com) for evaluating this model. |
sankettgorey/layouts_donut_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 1107628464.82414
num_examples: 4007
- name: test
num_bytes: 136074844.03892994
num_examples: 501
- name: validation
num_bytes: 139076925.03892994
num_examples: 501
download_size: 1146273186
dataset_size: 1382780233.902
---
# Dataset Card for "layouts_donut_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-logical_fallacies-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6380
num_examples: 5
- name: test
num_bytes: 460595
num_examples: 163
download_size: 13153
dataset_size: 466975
---
# Dataset Card for "mmlu-logical_fallacies-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
megantron/simpsons_captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 13735625.0
num_examples: 200
download_size: 13637896
dataset_size: 13735625.0
---
# Dataset Card for "simpsons_captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hg8888888888/8888 | ---
license: openrail
---
|
NativeFunction/taxi-fare-test | ---
dataset_info:
features:
- name: key
dtype: string
- name: pickup_datetime
dtype: string
- name: pickup_longitude
dtype: float64
- name: pickup_latitude
dtype: float64
- name: dropoff_longitude
dtype: float64
- name: dropoff_latitude
dtype: float64
- name: passenger_count
dtype: int64
splits:
- name: train
num_bytes: 977751
num_examples: 9914
download_size: 521219
dataset_size: 977751
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1713020737 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11111
num_examples: 26
download_size: 8912
dataset_size: 11111
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713020737"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mountain_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mountain_arknights
This is the dataset of mountain_arknights, containing 138 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 138 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 306 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 138 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 138 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 138 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 138 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 138 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 306 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 306 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 306 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
HorcruxNo13/toolwear_segmentsai | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 44532017.0
num_examples: 27
download_size: 4527506
dataset_size: 44532017.0
---
# Dataset Card for "toolwear_segmentsai"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-zeroshot__twitter-financial-news-topic-zeroshot__twitte-e590a9-28983144931 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- zeroshot/twitter-financial-news-topic
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: ['bertscore']
dataset_name: zeroshot/twitter-financial-news-topic
dataset_config: zeroshot--twitter-financial-news-topic
dataset_split: train
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: zeroshot/twitter-financial-news-topic
* Config: zeroshot--twitter-financial-news-topic
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@peterdevathala](https://huggingface.co/peterdevathala) for evaluating this model. |
Cyberelay/nebula_ghostbusters | ---
license: openrail
task_categories:
- text-to-image
language:
- en
---
Test Only |
HuggingFaceM4/SNLI-VE | ---
license: bsd-3-clause
---
|
Ranjan22/Marvel_Characters_Face_Data | ---
license: odc-by
---
|
HuggingFaceH4/spin-ultrachat-prompts-qwen-1.5-0.5b-iter0-iter1 | ---
dataset_info:
features:
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
- name: prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 573624231.0
num_examples: 99584
- name: test
num_bytes: 5727702.0
num_examples: 1000
download_size: 328129611
dataset_size: 579351933.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Cognitive-Lab/Aya_Marathi | ---
dataset_info:
- config_name: complete_dataset
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 4119380566
num_examples: 3575683
download_size: 1356608562
dataset_size: 4119380566
- config_name: templated_indic_paraphrase
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 773026
num_examples: 1001
download_size: 255328
dataset_size: 773026
- config_name: templated_indic_sentiment
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 745782
num_examples: 1156
download_size: 307088
dataset_size: 745782
- config_name: templated_xlel_wd
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 1429815
num_examples: 1161
download_size: 503445
dataset_size: 1429815
- config_name: translated_adversarial_qa
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 22931274
num_examples: 10000
download_size: 5791791
dataset_size: 22931274
- config_name: translated_cnn_dailymail
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 612614699
num_examples: 100000
download_size: 225268596
dataset_size: 612614699
- config_name: translated_dolly
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 30988209
num_examples: 14808
download_size: 12027773
dataset_size: 30988209
- config_name: translated_flan_coqa
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 40235091
num_examples: 6409
download_size: 15430700
dataset_size: 40235091
- config_name: translated_flan_cot
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 98331455
num_examples: 91910
download_size: 34295182
dataset_size: 98331455
- config_name: translated_flan_gem_wiki
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 160449052
num_examples: 27147
download_size: 58344118
dataset_size: 160449052
- config_name: translated_flan_lambada
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 2885792
num_examples: 4279
download_size: 1068206
dataset_size: 2885792
- config_name: translated_flan_qa
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 433734
num_examples: 540
download_size: 154930
dataset_size: 433734
- config_name: translated_hotpotqa
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 171545509
num_examples: 355476
download_size: 51033087
dataset_size: 171545509
- config_name: translated_joke_explaination
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 1334320
num_examples: 754
download_size: 268958
dataset_size: 1334320
- config_name: translated_mintaka
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 5691487
num_examples: 14000
download_size: 989653
dataset_size: 5691487
- config_name: translated_nqopen
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 53241715
num_examples: 175850
download_size: 15297113
dataset_size: 53241715
- config_name: translated_paws
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 44574443
num_examples: 49401
download_size: 6122839
dataset_size: 44574443
- config_name: translated_piqa
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 17321849
num_examples: 16113
download_size: 5006389
dataset_size: 17321849
- config_name: translated_soda
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 977442017
num_examples: 1191582
download_size: 283089235
dataset_size: 977442017
- config_name: translated_wiki_split
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 1019477318
num_examples: 989944
download_size: 319167021
dataset_size: 1019477318
- config_name: translated_wikiqa
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 740836
num_examples: 1040
download_size: 266979
dataset_size: 740836
- config_name: translated_xlel_wd
features:
- name: targets
dtype: string
- name: task_type
dtype: string
- name: id
dtype: int64
- name: template_id
dtype: int64
- name: dataset_name
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: sub_dataset_name
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 856193143
num_examples: 523112
download_size: 321169799
dataset_size: 856193143
configs:
- config_name: complete_dataset
data_files:
- split: train
path: complete_dataset/train-*
- config_name: templated_indic_paraphrase
data_files:
- split: train
path: templated_indic_paraphrase/train-*
- config_name: templated_indic_sentiment
data_files:
- split: train
path: templated_indic_sentiment/train-*
- config_name: templated_xlel_wd
data_files:
- split: train
path: templated_xlel_wd/train-*
- config_name: translated_adversarial_qa
data_files:
- split: train
path: translated_adversarial_qa/train-*
- config_name: translated_cnn_dailymail
data_files:
- split: train
path: translated_cnn_dailymail/train-*
- config_name: translated_dolly
data_files:
- split: train
path: translated_dolly/train-*
- config_name: translated_flan_coqa
data_files:
- split: train
path: translated_flan_coqa/train-*
- config_name: translated_flan_cot
data_files:
- split: train
path: translated_flan_cot/train-*
- config_name: translated_flan_gem_wiki
data_files:
- split: train
path: translated_flan_gem_wiki/train-*
- config_name: translated_flan_lambada
data_files:
- split: train
path: translated_flan_lambada/train-*
- config_name: translated_flan_qa
data_files:
- split: train
path: translated_flan_qa/train-*
- config_name: translated_hotpotqa
data_files:
- split: train
path: translated_hotpotqa/train-*
- config_name: translated_joke_explaination
data_files:
- split: train
path: translated_joke_explaination/train-*
- config_name: translated_mintaka
data_files:
- split: train
path: translated_mintaka/train-*
- config_name: translated_nqopen
data_files:
- split: train
path: translated_nqopen/train-*
- config_name: translated_paws
data_files:
- split: train
path: translated_paws/train-*
- config_name: translated_piqa
data_files:
- split: train
path: translated_piqa/train-*
- config_name: translated_soda
data_files:
- split: train
path: translated_soda/train-*
- config_name: translated_wiki_split
data_files:
- split: train
path: translated_wiki_split/train-*
- config_name: translated_wikiqa
data_files:
- split: train
path: translated_wikiqa/train-*
- config_name: translated_xlel_wd
data_files:
- split: train
path: translated_xlel_wd/train-*
---
|
Anusha64/AeonDataset | ---
license: mit
---
|
CVasNLPExperiments/fairness_firefighter_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_4800 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: scores
sequence: float64
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 2480232
num_examples: 4800
download_size: 179504
dataset_size: 2480232
---
# Dataset Card for "fairness_firefighter_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_4800"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HamdanXI/lj-inprogress-2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
sequence: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 15192445537
num_examples: 13100
download_size: 3747503561
dataset_size: 15192445537
---
# Dataset Card for "lj-inprogress-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rami/prompts_eval | ---
dataset_info:
features:
- name: prompt_generator
dtype: string
- name: chat_gpt_response
dtype: float64
- name: instructions
dtype: string
- name: temperature
dtype: float64
- name: repetition_penalty
dtype: float64
- name: top_p
dtype: float64
- name: reference_text
dtype: string
splits:
- name: train
num_bytes: 931610
num_examples: 390
download_size: 420124
dataset_size: 931610
---
# Dataset Card for "prompts_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dvilasuero/tweets_for_labelling | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 3480.269230769231
num_examples: 41
- name: test
num_bytes: 933.7307692307693
num_examples: 11
download_size: 7108
dataset_size: 4414.0
---
# Dataset Card for "tweets_for_labelling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/36e1d427 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 232
num_examples: 10
download_size: 1385
dataset_size: 232
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "36e1d427"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceM4/flickr30k | ---
license: other
---
|
open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B | ---
pretty_name: Evaluation run of teknium/Mistral-Trismegistus-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/Mistral-Trismegistus-7B](https://huggingface.co/teknium/Mistral-Trismegistus-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T09:46:08.723071](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B/blob/main/results_2023-10-25T09-46-08.723071.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and in the \"latest\" split\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.010591442953020135,\n\
\ \"em_stderr\": 0.0010483469790502314,\n \"f1\": 0.07238674496644287,\n\
\ \"f1_stderr\": 0.001675223530701393,\n \"acc\": 0.4004875617305928,\n\
\ \"acc_stderr\": 0.010548628211357203\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.010591442953020135,\n \"em_stderr\": 0.0010483469790502314,\n\
\ \"f1\": 0.07238674496644287,\n \"f1_stderr\": 0.001675223530701393\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09931766489764973,\n \
\ \"acc_stderr\": 0.008238371412683985\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7016574585635359,\n \"acc_stderr\": 0.012858885010030421\n\
\ }\n}\n```"
repo_url: https://huggingface.co/teknium/Mistral-Trismegistus-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T09_46_08.723071
path:
- '**/details_harness|drop|3_2023-10-25T09-46-08.723071.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T09-46-08.723071.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T09_46_08.723071
path:
- '**/details_harness|gsm8k|5_2023-10-25T09-46-08.723071.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T09-46-08.723071.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T09_46_08.723071
path:
- '**/details_harness|winogrande|5_2023-10-25T09-46-08.723071.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T09-46-08.723071.parquet'
- config_name: results
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- results_2023-10-12T08-45-24.509522.parquet
- split: 2023_10_25T09_46_08.723071
path:
- results_2023-10-25T09-46-08.723071.parquet
- split: latest
path:
- results_2023-10-25T09-46-08.723071.parquet
---
# Dataset Card for Evaluation run of teknium/Mistral-Trismegistus-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/Mistral-Trismegistus-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/Mistral-Trismegistus-7B](https://huggingface.co/teknium/Mistral-Trismegistus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B",
"harness_winogrande_5",
split="train")
```
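As the configs above show, each run's split name is derived from the run timestamp by replacing `-` and `:` with `_` (the parquet filenames keep the original `-` separators). A small helper, purely illustrative and not part of the `datasets` library, shows the mapping:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name by replacing
    '-' and ':' with '_', as in this dataset's configs."""
    return ts.replace("-", "_").replace(":", "_")

# e.g. the run timestamp 2023-10-25T09:46:08.723071
print(timestamp_to_split("2023-10-25T09:46:08.723071"))
# → 2023_10_25T09_46_08.723071
```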
## Latest results
These are the [latest results from run 2023-10-25T09:46:08.723071](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B/blob/main/results_2023-10-25T09-46-08.723071.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.010591442953020135,
"em_stderr": 0.0010483469790502314,
"f1": 0.07238674496644287,
"f1_stderr": 0.001675223530701393,
"acc": 0.4004875617305928,
"acc_stderr": 0.010548628211357203
},
"harness|drop|3": {
"em": 0.010591442953020135,
"em_stderr": 0.0010483469790502314,
"f1": 0.07238674496644287,
"f1_stderr": 0.001675223530701393
},
"harness|gsm8k|5": {
"acc": 0.09931766489764973,
"acc_stderr": 0.008238371412683985
},
"harness|winogrande|5": {
"acc": 0.7016574585635359,
"acc_stderr": 0.012858885010030421
}
}
```
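The top-level `"all"` accuracy above appears to be the unweighted mean of the per-task accuracies (here gsm8k and winogrande; drop reports em/f1 instead). A quick sanity check using the numbers from the results above:

```python
# Per-task accuracies copied from the results above.
gsm8k_acc = 0.09931766489764973
winogrande_acc = 0.7016574585635359

# The aggregated "acc" is the unweighted mean of the task accuracies.
mean_acc = (gsm8k_acc + winogrande_acc) / 2
print(mean_acc)  # ≈ 0.4004875617305928, matching the "all" value
```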
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yuwan0/laion_with_images | ---
dataset_info:
features:
- name: image
dtype: string
- name: text
dtype: string
- name: tag
dtype: int64
splits:
- name: train
num_bytes: 258133826
num_examples: 994
download_size: 174379018
dataset_size: 258133826
---
# Dataset Card for "laion_with_images"
Selected 1,000 images and used CLIP to generate a caption for each image.
tollefj/nor-instruct | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 33165183
num_examples: 67714
- name: test
num_bytes: 324411
num_examples: 684
download_size: 20779304
dataset_size: 33489594
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
A concatenated instruction-based dataset from the following:
- NbAiLab/norwegian-alpaca
- RuterNorway/Fleurs-Alpaca-EN-NO
- RuterNorway/OpenOrcaNo-15k |
AkikoOu/beijingopera-trainorigin | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 343446.0
num_examples: 9
download_size: 342528
dataset_size: 343446.0
---
# Dataset Card for "beijingopera-trainorigin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexusflow/VT_MultiAPIs | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: args_dicts
list:
- name: default
dtype: string
- name: description
dtype: string
- name: name
dtype: string
- name: required
dtype: bool
- name: type
dtype: string
- name: api_type
dtype: string
- name: description
dtype: string
- name: name
dtype: string
- name: dataset
dtype: string
splits:
- name: train
num_bytes: 20764
num_examples: 29
download_size: 14860
dataset_size: 20764
---
# Dataset Card for "new_vt_apis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
taesiri/GameplayCaptions-GPT-4V-V2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: game_name
dtype: string
- name: gpt-4v-response
dtype: string
- name: youtube_video_id
dtype: string
- name: category
dtype: string
- name: chat_history
sequence:
sequence: string
- name: chat_history_json
list:
- name: authorRole
dtype: string
- name: content
struct:
- name: messageImages
list:
- name: alt
dtype: string
- name: height
dtype: int64
- name: src
dtype: string
- name: width
dtype: int64
- name: textContent
dtype: string
- name: messageId
dtype: string
- name: file_path
dtype: string
splits:
- name: train
num_bytes: 32097429961.114
num_examples: 70799
download_size: 31632666082
dataset_size: 32097429961.114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
doon-D/mydon-db | ---
license: mit
---
|
joey234/rotten_tomatoes_affix | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: words_with_affixes
sequence: string
splits:
- name: test
num_bytes: 32292
num_examples: 194
download_size: 24662
dataset_size: 32292
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "rotten_tomatoes_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1 | ---
pretty_name: Evaluation run of mistralai/Mistral-7B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T09:43:48.997990](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1/blob/main/results_2023-10-24T09-43-48.997990.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.37038590604026844,\n\
\ \"em_stderr\": 0.00494543044549648,\n \"f1\": 0.43100566275167973,\n\
\ \"f1_stderr\": 0.00478990485809286,\n \"acc\": 0.4398533245809979,\n\
\ \"acc_stderr\": 0.01100025548646791\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.37038590604026844,\n \"em_stderr\": 0.00494543044549648,\n\
\ \"f1\": 0.43100566275167973,\n \"f1_stderr\": 0.00478990485809286\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1425322213798332,\n \
\ \"acc_stderr\": 0.009629588445673814\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262006\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|arc:challenge|25_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T09_43_48.997990
path:
- '**/details_harness|drop|3_2023-10-24T09-43-48.997990.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T09-43-48.997990.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T09_43_48.997990
path:
- '**/details_harness|gsm8k|5_2023-10-24T09-43-48.997990.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T09-43-48.997990.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hellaswag|10_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T06-38-48.353025.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T06-38-48.353025.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T09_43_48.997990
path:
- '**/details_harness|winogrande|5_2023-10-24T09-43-48.997990.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T09-43-48.997990.parquet'
- config_name: results
data_files:
- split: 2023_10_10T06_38_48.353025
path:
- results_2023-10-10T06-38-48.353025.parquet
- split: 2023_10_24T09_43_48.997990
path:
- results_2023-10-24T09-43-48.997990.parquet
- split: latest
path:
- results_2023-10-24T09-43-48.997990.parquet
---
# Dataset Card for Evaluation run of mistralai/Mistral-7B-Instruct-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1",
"harness_winogrande_5",
split="train")
```
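As the config listing above shows, each run's split is named after the run timestamp, with the `-` and `:` characters replaced by underscores (e.g. run `2023-10-24T09:43:48.997990` lives in split `2023_10_24T09_43_48.997990`). A minimal sketch of that normalization (the exact rule used by the leaderboard tooling is an assumption inferred from the names above):

```python
def split_name_from_timestamp(ts: str) -> str:
    """Derive a split name from a run timestamp.

    Assumption (inferred from the config listing, not documented): '-' and ':'
    are replaced with '_', while the fractional-seconds dot is kept as-is.
    """
    return ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2023-10-24T09:43:48.997990"))
```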
## Latest results
These are the [latest results from run 2023-10-24T09:43:48.997990](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mistral-7B-Instruct-v0.1/blob/main/results_2023-10-24T09-43-48.997990.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"em": 0.37038590604026844,
"em_stderr": 0.00494543044549648,
"f1": 0.43100566275167973,
"f1_stderr": 0.00478990485809286,
"acc": 0.4398533245809979,
"acc_stderr": 0.01100025548646791
},
"harness|drop|3": {
"em": 0.37038590604026844,
"em_stderr": 0.00494543044549648,
"f1": 0.43100566275167973,
"f1_stderr": 0.00478990485809286
},
"harness|gsm8k|5": {
"acc": 0.1425322213798332,
"acc_stderr": 0.009629588445673814
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.012370922527262006
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
embedding-data/coco_captions_quintets | ---
license: mit
language:
- en
paperswithcode_id: embedding-data/coco_captions
pretty_name: coco_captions
task_categories:
- sentence-similarity
- paraphrase-mining
task_ids:
- semantic-similarity-classification
---
# Dataset Card for "coco_captions"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://cocodataset.org/#home](https://cocodataset.org/#home)
- **Repository:** [https://github.com/cocodataset/cocodataset.github.io](https://github.com/cocodataset/cocodataset.github.io)
- **Paper:** [Microsoft COCO: Common Objects in Context](https://arxiv.org/abs/1405.0312)
- **Point of Contact:** [info@cocodataset.org](info@cocodataset.org)
- **Size of downloaded dataset files:**
- **Size of the generated dataset:**
- **Total amount of disk used:** 6.32 MB
### Dataset Summary
COCO is a large-scale object detection, segmentation, and captioning dataset. This repo contains five captions per image; useful for sentence similarity tasks.
Disclaimer: The team releasing COCO did not upload the dataset to the Hub and did not write a dataset card.
These steps were done by the Hugging Face team.
### Supported Tasks
- [Sentence Transformers](https://huggingface.co/sentence-transformers) training; useful for semantic search and sentence similarity.
### Languages
- English.
## Dataset Structure
Each example in the dataset contains a quintet of similar sentences and is formatted as a dictionary with the key "set" whose value is the list of five sentences:
```
{"set": [sentence_1, sentence_2, sentence3, sentence4, sentence5]}
{"set": [sentence_1, sentence_2, sentence3, sentence4, sentence5]}
...
{"set": [sentence_1, sentence_2, sentence3, sentence4, sentence5]}
```
This dataset is useful for training Sentence Transformers models. Refer to the following post on how to train models using similar pairs of sentences.
### Usage Example
Install the 🤗 Datasets library with `pip install datasets` and load the dataset from the Hub with:
```python
from datasets import load_dataset
dataset = load_dataset("embedding-data/coco_captions")
```
The dataset is loaded as a `DatasetDict` and has the format:
```python
DatasetDict({
train: Dataset({
features: ['set'],
num_rows: 82783
})
})
```
Review example `i` with:
```python
dataset["train"][i]["set"]
```
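Because every example groups five captions of the same image, each quintet can be expanded into positive caption pairs for contrastive training. A minimal, self-contained sketch (pure Python; the field name `"set"` matches the format above, and the captions here are made-up placeholders):

```python
from itertools import combinations

def quintet_to_pairs(example):
    """Expand one {"set": [s1..s5]} example into all 10 positive caption pairs."""
    return list(combinations(example["set"], 2))

# Hypothetical example in the dataset's format (not taken from the actual data).
example = {"set": ["a cat on a mat", "a cat sitting on a rug",
                   "a small cat resting", "a kitten on a carpet",
                   "a cat lying on the floor"]}
pairs = quintet_to_pairs(example)
print(len(pairs))  # 10 pairs per quintet (5 choose 2)
```

Pairs built this way can then be fed to a pair-based loss such as the ones used in Sentence Transformers training.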
### Data Instances
[More Information Needed](https://cocodataset.org/#format-data)
### Data Splits
[More Information Needed](https://cocodataset.org/#format-data)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://cocodataset.org/#home)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://cocodataset.org/#home)
#### Who are the source language producers?
[More Information Needed](https://cocodataset.org/#home)
### Annotations
#### Annotation process
[More Information Needed](https://cocodataset.org/#home)
#### Who are the annotators?
[More Information Needed](https://cocodataset.org/#home)
### Personal and Sensitive Information
[More Information Needed](https://cocodataset.org/#home)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://cocodataset.org/#home)
### Discussion of Biases
[More Information Needed](https://cocodataset.org/#home)
### Other Known Limitations
[More Information Needed](https://cocodataset.org/#home)
## Additional Information
### Dataset Curators
[More Information Needed](https://cocodataset.org/#home)
### Licensing Information
The annotations in this dataset along with this website belong to the COCO Consortium
and are licensed under a [Creative Commons Attribution 4.0 License](https://creativecommons.org/licenses/by/4.0/legalcode)
### Citation Information
[More Information Needed](https://cocodataset.org/#home)
### Contributions
Thanks to:
- Tsung-Yi Lin - Google Brain
- Genevieve Patterson - MSR, Trash TV
- Matteo R. Ronchi - Caltech
- Yin Cui - Google
- Michael Maire - TTI-Chicago
- Serge Belongie - Cornell Tech
- Lubomir Bourdev - WaveOne, Inc.
- Ross Girshick - FAIR
- James Hays - Georgia Tech
- Pietro Perona - Caltech
- Deva Ramanan - CMU
- Larry Zitnick - FAIR
- Piotr Dollár - FAIR
for adding this dataset.
|
macadeliccc/simpsons-images | ---
language:
- en
license: apache-2.0
size_categories:
- n<1K
pretty_name: The Simpson's Images
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 21033758.0
num_examples: 313
download_size: 20066480
dataset_size: 21033758.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- Animation
- art
---
|
roydcarlson/sidewalk-imagery2 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 3138394.0
num_examples: 10
download_size: 3139599
dataset_size: 3138394.0
---
# Dataset Card for "sidewalk-imagery2"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_his_him | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1455
num_examples: 18
- name: test
num_bytes: 2227
num_examples: 30
- name: train
num_bytes: 22936
num_examples: 301
download_size: 19095
dataset_size: 26618
---
# Dataset Card for "MULTI_VALUE_cola_his_him"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct | ---
pretty_name: Evaluation run of yhyhy3/open_llama_7b_v2_med_instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yhyhy3/open_llama_7b_v2_med_instruct](https://huggingface.co/yhyhy3/open_llama_7b_v2_med_instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-14T23:04:48.092833](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct/blob/main/results_2023-10-14T23-04-48.092833.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893119089,\n \"f1\": 0.06285234899328887,\n\
\ \"f1_stderr\": 0.0014049416535996321,\n \"acc\": 0.3567227929125231,\n\
\ \"acc_stderr\": 0.008432051001442599\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119089,\n\
\ \"f1\": 0.06285234899328887,\n \"f1_stderr\": 0.0014049416535996321\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02047005307050796,\n \
\ \"acc_stderr\": 0.003900413385915718\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6929755327545383,\n \"acc_stderr\": 0.01296368861696948\n\
\ }\n}\n```"
repo_url: https://huggingface.co/yhyhy3/open_llama_7b_v2_med_instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_14T23_04_48.092833
path:
- '**/details_harness|drop|3_2023-10-14T23-04-48.092833.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T23-04-48.092833.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_14T23_04_48.092833
path:
- '**/details_harness|gsm8k|5_2023-10-14T23-04-48.092833.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-14T23-04-48.092833.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:52:38.098362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:52:38.098362.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T11:52:38.098362.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_14T23_04_48.092833
path:
- '**/details_harness|winogrande|5_2023-10-14T23-04-48.092833.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T23-04-48.092833.parquet'
- config_name: results
data_files:
- split: 2023_07_24T11_52_38.098362
path:
- results_2023-07-24T11:52:38.098362.parquet
- split: 2023_10_14T23_04_48.092833
path:
- results_2023-10-14T23-04-48.092833.parquet
- split: latest
path:
- results_2023-10-14T23-04-48.092833.parquet
---
# Dataset Card for Evaluation run of yhyhy3/open_llama_7b_v2_med_instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yhyhy3/open_llama_7b_v2_med_instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yhyhy3/open_llama_7b_v2_med_instruct](https://huggingface.co/yhyhy3/open_llama_7b_v2_med_instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct",
"harness_winogrande_5",
split="train")
```
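The split-naming convention described above can be illustrated offline: run splits are named with zero-padded, ISO-style run timestamps, so lexicographic ordering matches chronological ordering, and the most recent run can be picked with a plain `max`. A minimal sketch (the two timestamps are the ones appearing in this card's config listing):

```python
# Run splits are named after the run timestamp; "latest" points to the most
# recent one. Zero-padded ISO-style timestamps sort lexicographically in
# chronological order, so max() selects the latest run.
run_splits = [
    "2023_07_24T11_52_38.098362",
    "2023_10_14T23_04_48.092833",
]

latest = max(run_splits)  # lexicographic max == most recent run
print(latest)  # 2023_10_14T23_04_48.092833
```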
## Latest results
These are the [latest results from run 2023-10-14T23:04:48.092833](https://huggingface.co/datasets/open-llm-leaderboard/details_yhyhy3__open_llama_7b_v2_med_instruct/blob/main/results_2023-10-14T23-04-48.092833.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119089,
"f1": 0.06285234899328887,
"f1_stderr": 0.0014049416535996321,
"acc": 0.3567227929125231,
"acc_stderr": 0.008432051001442599
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119089,
"f1": 0.06285234899328887,
"f1_stderr": 0.0014049416535996321
},
"harness|gsm8k|5": {
"acc": 0.02047005307050796,
"acc_stderr": 0.003900413385915718
},
"harness|winogrande|5": {
"acc": 0.6929755327545383,
"acc_stderr": 0.01296368861696948
}
}
```
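Once loaded, the aggregated results above are a plain nested dict, so individual metrics can be pulled out by their `harness|task|n_shot` keys. A minimal sketch using the winogrande accuracy shown above (the JSON here is an abridged copy of the results block, not a fresh download):

```python
import json

# Abridged copy of the aggregated results shown above.
results_json = """
{
  "harness|winogrande|5": {
    "acc": 0.6929755327545383,
    "acc_stderr": 0.01296368861696948
  }
}
"""

results = json.loads(results_json)
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande 5-shot accuracy: {winogrande_acc:.4f}")  # 0.6930
```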
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
multi-train/emb-trex-train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: idx
dtype: int64
- name: task_name
dtype: string
splits:
- name: train
num_bytes: 2175572925
num_examples: 2284168
download_size: 1321673983
dataset_size: 2175572925
---
# Dataset Card for "emb-trex-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nyuuzyou/3dnews-articles | ---
annotations_creators:
- crowdsourced
language:
- ru
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: 3DNews Articles
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Dataset Card for 3DNews Articles
### Dataset Summary
The dataset comprises news articles from the Russian technology website [3DNews](https://3dnews.ru), spanning the years 2003 to 2024. It covers updates in the world of digital technology along with commentary from industry experts.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- `id`: A unique identifier for each item (integer)
- `title`: The title of the article or news item (string)
- `content`: The main content of the article or news item. This is a string value that may contain multiple paragraphs and special characters.
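The record shape described by these fields can be sketched as a plain dict; the example values below are hypothetical, not taken from the dataset itself:

```python
# Hypothetical record matching the fields described above
# (example values are made up, not drawn from the dataset).
article = {
    "id": 1,                                      # unique integer identifier
    "title": "Пример заголовка",                  # article title (string)
    "content": "Первый абзац.\n\nВторой абзац.",  # body, may span paragraphs
}

assert isinstance(article["id"], int)
assert isinstance(article["title"], str)
assert isinstance(article["content"], str)
```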
### Data Splits
All examples are in the train split; there is no validation split.
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: https://creativecommons.org/publicdomain/zero/1.0/deed.en
To learn more about CC0, visit the Creative Commons website: https://creativecommons.org/publicdomain/zero/1.0/
### Dataset Curators
- [nyuuzyou](https://ducks.party)
|
nilq/baby-python-and-tiny-stories-and-lua | ---
dataset_info:
features:
- name: content
dtype: string
- name: origin
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 4471145457
num_examples: 11198732
- name: validation
num_bytes: 482322816
num_examples: 1093175
download_size: 1794211908
dataset_size: 4953468273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|