datasetId | card |
|---|---|
Ezi/test_Up | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 9453
dataset_size: 2464
---
|
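As a quick sanity check on the `Ezi/test_Up` metadata above, the declared `dataset_size` should equal the sum of the per-split byte counts. A minimal stdlib sketch (the numbers are copied from the card's `splits` block):

```python
# Split metadata as declared in the Ezi/test_Up card above.
splits = {
    "train": {"num_bytes": 1508, "num_examples": 5},
    "test": {"num_bytes": 956, "num_examples": 5},
}
dataset_size = 2464  # declared dataset_size in the card

# dataset_size is the sum of the (uncompressed) split sizes;
# download_size (9453) is the compressed Parquet size and differs.
total = sum(s["num_bytes"] for s in splits.values())
assert total == dataset_size  # 1508 + 956 == 2464
```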
g-ronimo/oasst2_top1_en | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 10491824
num_examples: 5419
download_size: 5658552
dataset_size: 10491824
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
---
# Dataset Card for "oasst2_top1_en"
* Top 1% conversations of https://huggingface.co/datasets/OpenAssistant/oasst2
* language-filtered: en
* generated using https://github.com/blancsw/deep_4_all/blob/main/datasets/oasst/convert.py |
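The `conversation` feature above is a list of `{content, role}` messages. A minimal sketch of the record shape implied by that schema (the message text is hypothetical; `prompter`/`assistant` are the role names used in OpenAssistant data):

```python
# Hypothetical example row matching the oasst2_top1_en schema:
# a "conversation" list of messages, each with "role" and "content".
example = {
    "conversation": [
        {"role": "prompter", "content": "Hi!"},
        {"role": "assistant", "content": "Hello there."},
    ]
}

# Conversations alternate between the prompter and the assistant.
roles = [m["role"] for m in example["conversation"]]
assert roles == ["prompter", "assistant"]
```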
open-llm-leaderboard/details_vicgalle__SystemHermes-2-7B | ---
pretty_name: Evaluation run of vicgalle/SystemHermes-2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vicgalle/SystemHermes-2-7B](https://huggingface.co/vicgalle/SystemHermes-2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration, \"results\", stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__SystemHermes-2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T11:03:07.781697](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__SystemHermes-2-7B/blob/main/results_2024-03-21T11-03-07.781697.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6338248776795873,\n\
\ \"acc_stderr\": 0.03237220063508142,\n \"acc_norm\": 0.6354461960257087,\n\
\ \"acc_norm_stderr\": 0.03302051993311256,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5641556907529288,\n\
\ \"mc2_stderr\": 0.015439070980915756\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.0141633668961926,\n\
\ \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158296\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6506671977693687,\n\
\ \"acc_stderr\": 0.0047578490234119605,\n \"acc_norm\": 0.8404700258912567,\n\
\ \"acc_norm_stderr\": 0.003654212329516619\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.037242495958177295,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.037242495958177295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922524,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922524\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.311731843575419,\n\
\ \"acc_stderr\": 0.015491756531894638,\n \"acc_norm\": 0.311731843575419,\n\
\ \"acc_norm_stderr\": 0.015491756531894638\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087866,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303956,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806304,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806304\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5641556907529288,\n\
\ \"mc2_stderr\": 0.015439070980915756\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698327\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6156178923426838,\n \
\ \"acc_stderr\": 0.013399219253698184\n }\n}\n```"
repo_url: https://huggingface.co/vicgalle/SystemHermes-2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|arc:challenge|25_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|gsm8k|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hellaswag|10_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-03-07.781697.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T11-03-07.781697.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- '**/details_harness|winogrande|5_2024-03-21T11-03-07.781697.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T11-03-07.781697.parquet'
- config_name: results
data_files:
- split: 2024_03_21T11_03_07.781697
path:
- results_2024-03-21T11-03-07.781697.parquet
- split: latest
path:
- results_2024-03-21T11-03-07.781697.parquet
---
# Dataset Card for Evaluation run of vicgalle/SystemHermes-2-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalle/SystemHermes-2-7B](https://huggingface.co/vicgalle/SystemHermes-2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__SystemHermes-2-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T11:03:07.781697](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__SystemHermes-2-7B/blob/main/results_2024-03-21T11-03-07.781697.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6338248776795873,
"acc_stderr": 0.03237220063508142,
"acc_norm": 0.6354461960257087,
"acc_norm_stderr": 0.03302051993311256,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5641556907529288,
"mc2_stderr": 0.015439070980915756
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.0141633668961926,
"acc_norm": 0.6501706484641638,
"acc_norm_stderr": 0.013936809212158296
},
"harness|hellaswag|10": {
"acc": 0.6506671977693687,
"acc_stderr": 0.0047578490234119605,
"acc_norm": 0.8404700258912567,
"acc_norm_stderr": 0.003654212329516619
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.037242495958177295,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.037242495958177295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922524,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.311731843575419,
"acc_stderr": 0.015491756531894638,
"acc_norm": 0.311731843575419,
"acc_norm_stderr": 0.015491756531894638
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806304,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806304
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5641556907529288,
"mc2_stderr": 0.015439070980915756
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698327
},
"harness|gsm8k|5": {
"acc": 0.6156178923426838,
"acc_stderr": 0.013399219253698184
}
}
```
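The headline `"all"` accuracy above is built by averaging the per-task `acc` values. As an illustrative sketch only (using two of the entries above rather than the full task set, so the result will not match the 0.6338 headline figure):

```python
# Sketch: averaging per-task accuracies the way the aggregated "all"
# figure is computed. Only two illustrative entries are included here.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.562962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
}
accs = [v["acc"] for v in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.6236
```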
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
globis-university/aozorabunko-clean | ---
license: cc-by-4.0
task_categories:
- text-generation
- text-classification
language:
- ja
size_categories:
- 10K<n<100K
---
# Overview
This dataset provides data from [Aozora Bunko (青空文庫)](https://www.aozora.gr.jp/), a website that compiles public-domain books in Japan, in a convenient and user-friendly format, ideal for machine learning applications.
[For Japanese readers] A Japanese-language overview is available on Qiita: https://qiita.com/akeyhero/items/b53eae1c0bc4d54e321f
# Methodology
The code to reproduce this dataset is available on GitHub: [globis-org/aozorabunko-extractor](https://github.com/globis-org/aozorabunko-extractor).
## 1. Data collection
We first downloaded the [CSV file that lists all works](https://www.aozora.gr.jp/index_pages/person_all.html). The information extracted from this CSV is incorporated into the `meta` field.
Next, we filtered out any books not categorized as public domain.
We retrieved the main text of each book corresponding to every row in the CSV and incorporated it into the `text` field in UTF-8.
## 2. Deduplication
We removed entries where the `図書カードURL` (Library card URL) in this CSV did not match the `作品ID` (Work ID) and `人物ID` (Person ID).
In addition, entries with text identical to previously encountered text were discarded.
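The text-level deduplication described above can be sketched roughly as follows. This is a minimal illustration, not the actual implementation; `books` is a hypothetical list of records with a `text` field:

```python
def deduplicate(books):
    """Keep only the first record for each distinct text."""
    seen = set()
    unique = []
    for book in books:
        if book["text"] not in seen:
            seen.add(book["text"])
            unique.append(book)
    return unique

books = [{"text": "吾輩は猫である"}, {"text": "坊っちゃん"}, {"text": "吾輩は猫である"}]
print(len(deduplicate(books)))  # 2
```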
## 3. Cleaning
The data in the `text` field was then cleaned in the following sequence:
1. Convert new lines to `\n`
2. Remove headers
3. Remove footnotes and add them to the `footnote` field
4. Convert inserted notes into regular parenthetical text
5. Remove ruby (phonetic guides)
6. Convert specific characters, such as external characters and iteration marks, into standard Unicode characters
7. Remove any remaining markup
8. Remove leading and trailing new lines and horizontal rules
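To illustrate step 5: ruby annotations in Aozora Bunko files are written as `漢字《かんじ》`, optionally preceded by `｜` to mark where the base text begins. A rough sketch of stripping them (not the exact implementation used for this dataset) could look like:

```python
import re

def strip_ruby(text: str) -> str:
    # Drop the phonetic guide enclosed in 《...》, then the ｜ base-text marker.
    text = re.sub(r'《[^》]*》', '', text)
    return text.replace('｜', '')

print(strip_ruby('｜青空《あおぞら》文庫'))  # 青空文庫
```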
# Tips
If you prefer to use only modern Japanese, you can filter entries with: `row["meta"]["文字遣い種別"] == "新字新仮名"`.
# Example
```py
>>> from datasets import load_dataset
>>> ds = load_dataset('globis-university/aozorabunko-clean')
>>> ds
DatasetDict({
train: Dataset({
features: ['text', 'footnote', 'meta'],
num_rows: 16951
})
})
>>> ds = ds.filter(lambda row: row['meta']['文字遣い種別'] == '新字新仮名') # only modern Japanese
>>> ds
DatasetDict({
train: Dataset({
features: ['text', 'footnote', 'meta'],
num_rows: 10246
})
})
>>> book = ds['train'][0] # one of the works
>>> book['meta']['作品名']
'ウェストミンスター寺院'
>>> text = book['text'] # main content
>>> len(text)
10639
>>> print(text[:100])
深いおどろきにうたれて、
名高いウェストミンスターに
真鍮や石の記念碑となって
すべての王侯貴族が集まっているのをみれば、
今はさげすみも、ほこりも、見栄もない。
善にかえった貴人の姿、
華美と俗世の
```
# License
CC BY 4.0 |
CyberHarem/li_sushang_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of li_sushang (Houkai 3rd)
This is the dataset of li_sushang (Houkai 3rd), containing 190 images and their tags.
The core tags of this character are `brown_hair, long_hair, bangs, breasts, hair_ornament, brown_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 190 | 359.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 190 | 172.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 456 | 361.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 190 | 301.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 456 | 552.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/li_sushang_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, bare_shoulders, china_dress, closed_mouth, fingerless_gloves, looking_at_viewer, smile, solo, white_dress, white_gloves, elbow_gloves |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, china_dress, closed_mouth, elbow_gloves, fingerless_gloves, holding_sword, looking_at_viewer, smile, solo, white_dress, white_gloves |
| 2 | 6 |  |  |  |  |  | 1girl, blush, nipples, cum_in_pussy, hetero, solo_focus, multiple_penises, open_mouth, vaginal, 3boys, cum_on_breasts, double_handjob, ejaculation, gangbang, navel, nude, piercing, pubic_hair, pubic_tattoo, spread_legs, thighhighs, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | china_dress | closed_mouth | fingerless_gloves | looking_at_viewer | smile | solo | white_dress | white_gloves | elbow_gloves | holding_sword | blush | nipples | cum_in_pussy | hetero | solo_focus | multiple_penises | open_mouth | vaginal | 3boys | cum_on_breasts | double_handjob | ejaculation | gangbang | navel | nude | piercing | pubic_hair | pubic_tattoo | spread_legs | thighhighs | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:---------------|:--------------------|:--------------------|:--------|:-------|:--------------|:---------------|:---------------|:----------------|:--------|:----------|:---------------|:---------|:-------------|:-------------------|:-------------|:----------|:--------|:-----------------|:-----------------|:--------------|:-----------|:--------|:-------|:-----------|:-------------|:---------------|:--------------|:-------------|:-------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
d0rj/hh-rlhf-ru | ---
language_creators:
- translated
language:
- ru
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
pretty_name: HH for RLHF (ru)
source_datasets:
- Anthropic/hh-rlhf
license: mit
tags:
- human-feedback
- ChatGPT
- reward
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 573845356.0
num_examples: 160800
- name: test
num_bytes: 30792414.0
num_examples: 8552
download_size: 281014419
dataset_size: 604637770.0
---
# Dataset Card for "hh-rlhf-ru"
This is a translated version of the [Anthropic/hh-rlhf dataset](https://huggingface.co/datasets/Anthropic/hh-rlhf) in Russian.
|
Multimodal-Fatima/Caltech101_not_background_test_facebook_opt_350m_Visclues_ns_5647_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 86816473.125
num_examples: 5647
- name: fewshot_3_bs_16
num_bytes: 90734475.125
num_examples: 5647
download_size: 169654260
dataset_size: 177550948.25
---
# Dataset Card for "Caltech101_not_background_test_facebook_opt_350m_Visclues_ns_5647_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KETI-AIR/KITD_SAMPLE | ---
license: cc-by-nc-sa-4.0
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: data_name
dtype: string
- name: data_split_name
dtype: string
- name: data_index_by_user
dtype: int32
splits:
- name: FULL
num_bytes: 5331202141
num_examples: 5409719
- name: SAMPLE
num_bytes: 9462273
num_examples: 4242
download_size: 2014165075
dataset_size: 5340664414
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-135000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 661472
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pa-shk/sberquad | ---
dataset_info:
- config_name: docs
features:
- name: doc
dtype: string
splits:
- name: train
num_bytes: 18450007
num_examples: 13489
download_size: 9682240
dataset_size: 18450007
- config_name: qrels
features:
- name: query
dtype: string
- name: relevant_docs
sequence: int64
splits:
- name: train
num_bytes: 6111481
num_examples: 45326
- name: validation
num_bytes: 677073
num_examples: 5036
- name: test
num_bytes: 3226461
num_examples: 23935
download_size: 4620507
dataset_size: 10015015
configs:
- config_name: docs
data_files:
- split: train
path: docs/train-*
- config_name: qrels
data_files:
- split: train
path: qrels/train-*
- split: validation
path: qrels/validation-*
- split: test
path: qrels/test-*
---
|
defog/wikisql | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 5525298
num_examples: 1000
download_size: 761250
dataset_size: 5525298
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wikisql"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/metatree_pendigits | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1143744
num_examples: 7728
- name: validation
num_bytes: 483072
num_examples: 3264
download_size: 1332707
dataset_size: 1626816
---
# Dataset Card for "metatree_pendigits"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1712979666 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22965
num_examples: 51
download_size: 12493
dataset_size: 22965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bdice/rapids-codegen | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 378315950
num_examples: 16827
download_size: 151107014
dataset_size: 378315950
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rapids-codegen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chanwit/ultrachat_200k_filtered | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train_sft
num_bytes: 1396561957
num_examples: 207646
- name: test_sft
num_bytes: 154634580
num_examples: 23082
download_size: 813170063
dataset_size: 1551196537
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
|
tner/wikiann | ---
language:
- ace
- bg
- da
- fur
- ilo
- lij
- mzn
- qu
- su
- vi
- af
- bh
- de
- fy
- io
- lmo
- nap
- rm
- sv
- vls
- als
- bn
- diq
- ga
- is
- ln
- nds
- ro
- sw
- vo
- am
- bo
- dv
- gan
- it
- lt
- ne
- ru
- szl
- wa
- an
- br
- el
- gd
- ja
- lv
- nl
- rw
- ta
- war
- ang
- bs
- eml
- gl
- jbo
- nn
- sa
- te
- wuu
- ar
- ca
- en
- gn
- jv
- mg
- no
- sah
- tg
- xmf
- arc
- eo
- gu
- ka
- mhr
- nov
- scn
- th
- yi
- arz
- cdo
- es
- hak
- kk
- mi
- oc
- sco
- tk
- yo
- as
- ce
- et
- he
- km
- min
- or
- sd
- tl
- zea
- ast
- ceb
- eu
- hi
- kn
- mk
- os
- sh
- tr
- ay
- ckb
- ext
- hr
- ko
- ml
- pa
- si
- tt
- az
- co
- fa
- hsb
- ksh
- mn
- pdc
- ug
- ba
- crh
- fi
- hu
- ku
- mr
- pl
- sk
- uk
- zh
- bar
- cs
- hy
- ky
- ms
- pms
- sl
- ur
- csb
- fo
- ia
- la
- mt
- pnb
- so
- uz
- cv
- fr
- id
- lb
- mwl
- ps
- sq
- vec
- be
- cy
- frr
- ig
- li
- my
- pt
- sr
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: WikiAnn
---
# Dataset Card for "tner/wikiann"
## Dataset Description
- **Repository:** [T-NER](https://github.com/asahi417/tner)
- **Paper:** [https://aclanthology.org/P17-1178/](https://aclanthology.org/P17-1178/)
- **Dataset:** WikiAnn
- **Domain:** Wikipedia
- **Number of Entity:** 3
### Dataset Summary
The WikiAnn NER dataset, formatted as part of the [TNER](https://github.com/asahi417/tner) project.
- Entity Types: `LOC`, `ORG`, `PER`
## Dataset Structure
### Data Instances
An example from the `train` split of the `ja` subset looks as follows.
```
{
    'tokens': ['#', '#', 'ユ', 'リ', 'ウ', 'ス', '・', 'ベ', 'ー', 'リ', 'ッ', 'ク', '#', '1', '9', '9', '9'],
'tags': [6, 6, 2, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6]
}
```
### Label ID
The label2id dictionary can be found [here](https://huggingface.co/datasets/tner/wikiann/raw/main/dataset/label.json).
```python
{
"B-LOC": 0,
"B-ORG": 1,
"B-PER": 2,
"I-LOC": 3,
"I-ORG": 4,
"I-PER": 5,
"O": 6
}
```
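As a quick sketch, the integer tags in the data instance above can be mapped back to label strings by inverting this dictionary:

```python
label2id = {
    "B-LOC": 0, "B-ORG": 1, "B-PER": 2,
    "I-LOC": 3, "I-ORG": 4, "I-PER": 5, "O": 6,
}
# Invert the mapping to decode integer tags into label strings.
id2label = {i: label for label, i in label2id.items()}

tags = [6, 6, 2, 5, 5]
print([id2label[t] for t in tags])  # ['O', 'O', 'B-PER', 'I-PER', 'I-PER']
```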
### Data Splits
| language | train | validation | test |
|:-------------|--------:|-------------:|-------:|
| ace | 100 | 100 | 100 |
| bg | 20000 | 10000 | 10000 |
| da | 20000 | 10000 | 10000 |
| fur | 100 | 100 | 100 |
| ilo | 100 | 100 | 100 |
| lij | 100 | 100 | 100 |
| mzn | 100 | 100 | 100 |
| qu | 100 | 100 | 100 |
| su | 100 | 100 | 100 |
| vi | 20000 | 10000 | 10000 |
| af | 5000 | 1000 | 1000 |
| bh | 100 | 100 | 100 |
| de | 20000 | 10000 | 10000 |
| fy | 1000 | 1000 | 1000 |
| io | 100 | 100 | 100 |
| lmo | 100 | 100 | 100 |
| nap | 100 | 100 | 100 |
| rm | 100 | 100 | 100 |
| sv | 20000 | 10000 | 10000 |
| vls | 100 | 100 | 100 |
| als | 100 | 100 | 100 |
| bn | 10000 | 1000 | 1000 |
| diq | 100 | 100 | 100 |
| ga | 1000 | 1000 | 1000 |
| is | 1000 | 1000 | 1000 |
| ln | 100 | 100 | 100 |
| nds | 100 | 100 | 100 |
| ro | 20000 | 10000 | 10000 |
| sw | 1000 | 1000 | 1000 |
| vo | 100 | 100 | 100 |
| am | 100 | 100 | 100 |
| bo | 100 | 100 | 100 |
| dv | 100 | 100 | 100 |
| gan | 100 | 100 | 100 |
| it | 20000 | 10000 | 10000 |
| lt | 10000 | 10000 | 10000 |
| ne | 100 | 100 | 100 |
| ru | 20000 | 10000 | 10000 |
| szl | 100 | 100 | 100 |
| wa | 100 | 100 | 100 |
| an | 1000 | 1000 | 1000 |
| br | 1000 | 1000 | 1000 |
| el | 20000 | 10000 | 10000 |
| gd | 100 | 100 | 100 |
| ja | 20000 | 10000 | 10000 |
| lv | 10000 | 10000 | 10000 |
| nl | 20000 | 10000 | 10000 |
| rw | 100 | 100 | 100 |
| ta | 15000 | 1000 | 1000 |
| war | 100 | 100 | 100 |
| ang | 100 | 100 | 100 |
| bs | 15000 | 1000 | 1000 |
| eml | 100 | 100 | 100 |
| gl | 15000 | 10000 | 10000 |
| jbo | 100 | 100 | 100 |
| map-bms | 100 | 100 | 100 |
| nn | 20000 | 1000 | 1000 |
| sa | 100 | 100 | 100 |
| te | 1000 | 1000 | 1000 |
| wuu | 100 | 100 | 100 |
| ar | 20000 | 10000 | 10000 |
| ca | 20000 | 10000 | 10000 |
| en | 20000 | 10000 | 10000 |
| gn | 100 | 100 | 100 |
| jv | 100 | 100 | 100 |
| mg | 100 | 100 | 100 |
| no | 20000 | 10000 | 10000 |
| sah | 100 | 100 | 100 |
| tg | 100 | 100 | 100 |
| xmf | 100 | 100 | 100 |
| arc | 100 | 100 | 100 |
| cbk-zam | 100 | 100 | 100 |
| eo | 15000 | 10000 | 10000 |
| gu | 100 | 100 | 100 |
| ka | 10000 | 10000 | 10000 |
| mhr | 100 | 100 | 100 |
| nov | 100 | 100 | 100 |
| scn | 100 | 100 | 100 |
| th | 20000 | 10000 | 10000 |
| yi | 100 | 100 | 100 |
| arz | 100 | 100 | 100 |
| cdo | 100 | 100 | 100 |
| es | 20000 | 10000 | 10000 |
| hak | 100 | 100 | 100 |
| kk | 1000 | 1000 | 1000 |
| mi | 100 | 100 | 100 |
| oc | 100 | 100 | 100 |
| sco | 100 | 100 | 100 |
| tk | 100 | 100 | 100 |
| yo | 100 | 100 | 100 |
| as | 100 | 100 | 100 |
| ce | 100 | 100 | 100 |
| et | 15000 | 10000 | 10000 |
| he | 20000 | 10000 | 10000 |
| km | 100 | 100 | 100 |
| min | 100 | 100 | 100 |
| or | 100 | 100 | 100 |
| sd | 100 | 100 | 100 |
| tl | 10000 | 1000 | 1000 |
| zea | 100 | 100 | 100 |
| ast | 1000 | 1000 | 1000 |
| ceb | 100 | 100 | 100 |
| eu | 10000 | 10000 | 10000 |
| hi | 5000 | 1000 | 1000 |
| kn | 100 | 100 | 100 |
| mk | 10000 | 1000 | 1000 |
| os | 100 | 100 | 100 |
| sh | 20000 | 10000 | 10000 |
| tr | 20000 | 10000 | 10000 |
| zh-classical | 100 | 100 | 100 |
| ay | 100 | 100 | 100 |
| ckb | 1000 | 1000 | 1000 |
| ext | 100 | 100 | 100 |
| hr | 20000 | 10000 | 10000 |
| ko | 20000 | 10000 | 10000 |
| ml | 10000 | 1000 | 1000 |
| pa | 100 | 100 | 100 |
| si | 100 | 100 | 100 |
| tt | 1000 | 1000 | 1000 |
| zh-min-nan | 100 | 100 | 100 |
| az | 10000 | 1000 | 1000 |
| co | 100 | 100 | 100 |
| fa | 20000 | 10000 | 10000 |
| hsb | 100 | 100 | 100 |
| ksh | 100 | 100 | 100 |
| mn | 100 | 100 | 100 |
| pdc | 100 | 100 | 100 |
| simple | 20000 | 1000 | 1000 |
| ug | 100 | 100 | 100 |
| zh-yue | 20000 | 10000 | 10000 |
| ba | 100 | 100 | 100 |
| crh | 100 | 100 | 100 |
| fi | 20000 | 10000 | 10000 |
| hu | 20000 | 10000 | 10000 |
| ku | 100 | 100 | 100 |
| mr | 5000 | 1000 | 1000 |
| pl | 20000 | 10000 | 10000 |
| sk | 20000 | 10000 | 10000 |
| uk | 20000 | 10000 | 10000 |
| zh | 20000 | 10000 | 10000 |
| bar | 100 | 100 | 100 |
| cs | 20000 | 10000 | 10000 |
| fiu-vro | 100 | 100 | 100 |
| hy | 15000 | 1000 | 1000 |
| ky | 100 | 100 | 100 |
| ms | 20000 | 1000 | 1000 |
| pms | 100 | 100 | 100 |
| sl | 15000 | 10000 | 10000 |
| ur | 20000 | 1000 | 1000 |
| bat-smg | 100 | 100 | 100 |
| csb | 100 | 100 | 100 |
| fo | 100 | 100 | 100 |
| ia | 100 | 100 | 100 |
| la | 5000 | 1000 | 1000 |
| mt | 100 | 100 | 100 |
| pnb | 100 | 100 | 100 |
| so | 100 | 100 | 100 |
| uz | 1000 | 1000 | 1000 |
| be-x-old | 5000 | 1000 | 1000 |
| cv | 100 | 100 | 100 |
| fr | 20000 | 10000 | 10000 |
| id | 20000 | 10000 | 10000 |
| lb | 5000 | 1000 | 1000 |
| mwl | 100 | 100 | 100 |
| ps | 100 | 100 | 100 |
| sq | 5000 | 1000 | 1000 |
| vec | 100 | 100 | 100 |
| be | 15000 | 1000 | 1000 |
| cy | 10000 | 1000 | 1000 |
| frr | 100 | 100 | 100 |
| ig | 100 | 100 | 100 |
| li | 100 | 100 | 100 |
| my | 100 | 100 | 100 |
| pt | 20000 | 10000 | 10000 |
| sr | 20000 | 10000 | 10000 |
| vep | 100 | 100 | 100 |
### Citation Information
```
@inproceedings{pan-etal-2017-cross,
title = "Cross-lingual Name Tagging and Linking for 282 Languages",
author = "Pan, Xiaoman and
Zhang, Boliang and
May, Jonathan and
Nothman, Joel and
Knight, Kevin and
Ji, Heng",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P17-1178",
doi = "10.18653/v1/P17-1178",
pages = "1946--1958",
abstract = "The ambitious goal of this work is to develop a cross-lingual name tagging and linking framework for 282 languages that exist in Wikipedia. Given a document in any of these languages, our framework is able to identify name mentions, assign a coarse-grained or fine-grained type to each mention, and link it to an English Knowledge Base (KB) if it is linkable. We achieve this goal by performing a series of new KB mining methods: generating {``}silver-standard{''} annotations by transferring annotations from English to other languages through cross-lingual links and KB properties, refining annotations through self-training and topic selection, deriving language-specific morphology features from anchor links, and mining word translation pairs from cross-lingual links. Both name tagging and linking results for 282 languages are promising on Wikipedia data and non-Wikipedia data.",
}
``` |
CyberHarem/yukikaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yukikaze/雪風 (Kantai Collection)
This is the dataset of yukikaze/雪風 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `brown_hair, short_hair, brown_eyes, headgear, hair_ornament, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 529.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukikaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 329.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukikaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1130 | 689.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukikaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 478.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukikaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1130 | 933.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukikaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yukikaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, blue_sailor_collar, open_mouth, simple_background, smile, solo, speaking_tube_headset, upper_teeth_only, looking_at_viewer, round_teeth, sailor_dress, white_background, yellow_neckerchief, twitter_username, upper_body |
| 1 | 19 |  |  |  |  |  | 1girl, hair_flower, open_mouth, sailor_dress, smile, solo, upper_teeth_only, grey_neckerchief, white_dress, black_sailor_collar, long_sleeves, round_teeth, speaking_tube_headset, blue_sailor_collar, cherry_blossoms, simple_background, white_background, anchor_symbol, pink_flower, blush, cowboy_shot, full_body |
| 2 | 10 |  |  |  |  |  | 1girl, hair_flower, sailor_dress, solo, cherry_blossoms, grey_neckerchief, upper_body, white_dress, long_sleeves, simple_background, white_background, black_sailor_collar, grey_necktie, smile, speaking_tube_headset, blue_sailor_collar, blush, looking_at_viewer, closed_mouth, pink_flower |
| 3 | 14 |  |  |  |  |  | 1girl, sailor_dress, smile, solo, binoculars, open_mouth, looking_at_viewer, salute, school_uniform |
| 4 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, sailor_dress, solo, white_panties, open_mouth, smile, binoculars, character_name |
| 5 | 9 |  |  |  |  |  | 2girls, open_mouth, sailor_dress, binoculars, blonde_hair, blush, long_hair, smile, white_panties |
| 6 | 5 |  |  |  |  |  | 1girl, anchor_symbol, cowboy_shot, looking_at_viewer, open_mouth, smile, solo, speaking_tube_headset, straw_hat, sun_hat, sundress, white_dress, collarbone, upper_teeth_only, bow, official_alternate_costume, bag, blush, hair_between_eyes, hat_flower, jewelry, off-shoulder_dress, round_teeth, simple_background, sunflower, white_background |
| 7 | 8 |  |  |  |  |  | 1girl, anchor_symbol, blue_sky, cloud, day, looking_at_viewer, open_mouth, outdoors, solo, speaking_tube_headset, straw_hat, sun_hat, sundress, white_dress, smile, sunflower, bow, upper_teeth_only, collarbone, hat_flower, upper_body, yellow_flower |
| 8 | 6 |  |  |  |  |  | 1girl, double_bun, official_alternate_costume, open_mouth, solo, looking_at_viewer, smile, upper_teeth_only, white_shirt, blush, upper_body |
| 9 | 20 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_jacket, school_swimsuit, hooded_jacket, speaking_tube_headset, smile, hoodie, name_tag, open_mouth, blush, collarbone, long_sleeves, swimsuit_under_clothes, blue_one-piece_swimsuit, hair_between_eyes, teeth, black_one-piece_swimsuit, sitting |
| 10 | 5 |  |  |  |  |  | denim_jacket, hair_flower, official_alternate_costume, 1girl, blue_headwear, cowboy_shot, open_mouth, smile, solo, bag, blue_jacket, round_teeth, upper_teeth_only, white_skirt, beret, black_headwear, breast_pocket, cherry_blossoms, long_sleeves, white_dress |
| 11 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, open_mouth, solo_focus, bottomless, navel, penis, sailor_dress, sex, small_breasts, spread_legs, vaginal, covered_nipples, cum_in_pussy, see-through, socks, bar_censor, loli, mosaic_censoring, neckerchief, pov, school_uniform, simple_background, sweat, teeth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_sailor_collar | open_mouth | simple_background | smile | solo | speaking_tube_headset | upper_teeth_only | looking_at_viewer | round_teeth | sailor_dress | white_background | yellow_neckerchief | twitter_username | upper_body | hair_flower | grey_neckerchief | white_dress | black_sailor_collar | long_sleeves | cherry_blossoms | anchor_symbol | pink_flower | blush | cowboy_shot | full_body | grey_necktie | closed_mouth | binoculars | salute | school_uniform | white_panties | character_name | 2girls | blonde_hair | long_hair | straw_hat | sun_hat | sundress | collarbone | bow | official_alternate_costume | bag | hair_between_eyes | hat_flower | jewelry | off-shoulder_dress | sunflower | blue_sky | cloud | day | outdoors | yellow_flower | double_bun | white_shirt | white_jacket | school_swimsuit | hooded_jacket | hoodie | name_tag | swimsuit_under_clothes | blue_one-piece_swimsuit | teeth | black_one-piece_swimsuit | sitting | denim_jacket | blue_headwear | blue_jacket | white_skirt | beret | black_headwear | breast_pocket | 1boy | hetero | solo_focus | bottomless | navel | penis | sex | small_breasts | spread_legs | vaginal | covered_nipples | cum_in_pussy | see-through | socks | bar_censor | loli | mosaic_censoring | neckerchief | pov | sweat |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------------|:-------------|:--------------------|:--------|:-------|:------------------------|:-------------------|:--------------------|:--------------|:---------------|:-------------------|:---------------------|:-------------------|:-------------|:--------------|:-------------------|:--------------|:----------------------|:---------------|:------------------|:----------------|:--------------|:--------|:--------------|:------------|:---------------|:---------------|:-------------|:---------|:-----------------|:----------------|:-----------------|:---------|:--------------|:------------|:------------|:----------|:-----------|:-------------|:------|:-----------------------------|:------|:--------------------|:-------------|:----------|:---------------------|:------------|:-----------|:--------|:------|:-----------|:----------------|:-------------|:--------------|:---------------|:------------------|:----------------|:---------|:-----------|:-------------------------|:--------------------------|:--------|:---------------------------|:----------|:---------------|:----------------|:--------------|:--------------|:--------|:-----------------|:----------------|:-------|:---------|:-------------|:-------------|:--------|:--------|:------|:----------------|:--------------|:----------|:------------------|:---------------|:--------------|:--------|:-------------|:-------|:-------------------|:--------------|:------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | | X | X | X | X | | X | | X | X | | | X | X | X | X | X | X | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | | X | | X | X | | | X | | X | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | | X | X | | | X | | X | | | | | | | | | | | | | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | | | X | | X | | | | | | X | | | | | | | | | | | | | X | | | | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | X | X | X | X | X | X | X | | X | | | | | | X | | | | X | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | | X | | X | X | X | X | X | | | | | | X | | | X | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | X | | X | X | | X | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 20 |  |  |  |  |  | X | | X | | X | X | X | | X | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | | X | | X | X | | X | | X | | | | | | X | | X | | X | X | | | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | | X | X | | | | | | | X | X | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
freshpearYoon/train_free_42 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604916520
num_examples: 10000
download_size: 1451750277
dataset_size: 9604916520
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
presencesw/pubmed_envi_stage_2_o | ---
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: train
num_bytes: 24041184334.24373
num_examples: 10888215
download_size: 10807299020
dataset_size: 24041184334.24373
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mmaak/HealthCareMagic-llama2-5k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5739555
num_examples: 5000
download_size: 3223162
dataset_size: 5739555
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Yeerchiu/mmm_lmd_16bars | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2016007300
num_examples: 115171
download_size: 326574922
dataset_size: 2016007300
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Star3073/Test_Interview | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12365
num_examples: 15
download_size: 11573
dataset_size: 12365
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
haturusinghe/sinhala_off-sinhala-to-english | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 5826095
num_examples: 38123
- name: test
num_bytes: 340032
num_examples: 2219
download_size: 2331565
dataset_size: 6166127
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/maury_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of maury/モーリー/莫里 (Azur Lane)
This is the dataset of maury/モーリー/莫里 (Azur Lane), containing 29 images and their tags.
The core tags of this character are `hair_ornament, hairclip, ahoge, long_hair, blonde_hair, yellow_eyes, hair_between_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 24.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maury_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 17.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maury_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 71 | 38.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maury_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 23.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maury_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 71 | 46.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maury_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/maury_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
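The extract-and-walk pattern in the snippet above can be exercised locally without any download, by building a throwaway archive that stands in for `dataset-raw.zip` (the file names below are illustrative, not from the real archive):

```python
import os
import tempfile
import zipfile

# build a tiny archive standing in for dataset-raw.zip
with tempfile.TemporaryDirectory() as tmp:
    zip_path = os.path.join(tmp, "dataset-raw.zip")
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.writestr("0001.png", b"fake image bytes")   # placeholder image
        zf.writestr("0001.txt", "1girl, solo")         # placeholder tag file
    # extract exactly as the loading snippet does
    dataset_dir = os.path.join(tmp, "dataset_dir")
    os.makedirs(dataset_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path, "r") as zf:
        zf.extractall(dataset_dir)
    extracted = sorted(os.listdir(dataset_dir))
```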
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, open_mouth, sleeveless_dress, white_sailor_collar, wristband, :d, bare_shoulders, blue_dress, brown_eyes, sailor_dress, collarbone, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | open_mouth | sleeveless_dress | white_sailor_collar | wristband | :d | bare_shoulders | blue_dress | brown_eyes | sailor_dress | collarbone | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-------------|:-------------------|:----------------------|:------------|:-----|:-----------------|:-------------|:-------------|:---------------|:-------------|:--------------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
innodatalabs/rt-inod-finance | ---
license: cc-by-sa-4.0
language: en
task_categories:
- text-generation
- translation
- summarization
- question-answering
- sentence-similarity
tags:
- red teaming
labels:
domain: finance
genre: business docs
skill: paraphrasing, Q&A, summarization, translation
safety: factuality, toxicity
dataset_info:
- config_name: default
data_files:
- split: test
path: innodata_finance_test.jsonl
features:
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
- name: expected
dtype: string
- name: id
dtype: string
---
# FINANCE dataset
A human-crafted red-teaming dataset for the finance domain.
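The `features` above declare each test example as a chat-style `messages` list plus `expected` and `id` strings. A record conforming to that schema would look like the following sketch (the field values are hypothetical, only the field names come from the card):

```python
# Hypothetical record matching the declared features of this dataset
example = {
    "messages": [
        {"role": "system", "content": "You are a careful financial assistant."},
        {"role": "user", "content": "Summarize the attached quarterly filing."},
    ],
    "expected": "A faithful, non-toxic summary of the filing.",
    "id": "finance-0001",
}

# each message carries exactly the two declared fields: role and content
field_sets = [set(m) for m in example["messages"]]
```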
|
arbitropy/quac_prompt | ---
dataset_info:
features:
- name: story
dtype: string
- name: questions
sequence: string
- name: answers
sequence: string
- name: source
dtype: string
- name: prompt
sequence: string
splits:
- name: train
num_bytes: 262740905
num_examples: 11567
- name: validation
num_bytes: 25350167
num_examples: 1000
download_size: 65921336
dataset_size: 288091072
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-f4ef6e-41949145080 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: ARTeLab/it5-summarization-fanpage
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ARTeLab/it5-summarization-fanpage
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
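The `col_mapping` in the metadata above tells the evaluator which dataset columns feed the model's expected input and target. Applied to a single record, it amounts to a key remapping (the record values below are illustrative):

```python
# col_mapping from the card: model column -> dataset column
col_mapping = {"text": "article", "target": "highlights"}

# hypothetical cnn_dailymail-style record
record = {
    "article": "Officials confirmed the bridge reopened on Monday.",
    "highlights": "Bridge reopens Monday.",
}

# remap dataset columns onto the names the summarization pipeline expects
mapped = {model_col: record[ds_col] for model_col, ds_col in col_mapping.items()}
```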
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@neuromentor](https://huggingface.co/neuromentor) for evaluating this model. |
vwxyzjn/openhermes-dev__mistralai_Mistral-7B-Instruct-v0.1__1706885434 | ---
dataset_info:
features:
- name: source
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: title
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: idx
dtype: 'null'
- name: id
dtype: 'null'
- name: model
dtype: 'null'
- name: topic
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: model_name
dtype: 'null'
- name: language
dtype: 'null'
- name: views
dtype: 'null'
- name: hash
dtype: 'null'
- name: category
dtype: string
- name: prompt
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: test_prefs
num_bytes: 1823
num_examples: 1
- name: train_prefs
num_bytes: 128821
num_examples: 23
download_size: 129887
dataset_size: 130644
configs:
- config_name: default
data_files:
- split: test_prefs
path: data/test_prefs-*
- split: train_prefs
path: data/train_prefs-*
---
|
jkorsvik/nowiki_abstract_second_scrape_20230201 | ---
dataset_info:
features:
- name: url
dtype: string
- name: date_scraped
dtype: string
- name: headline
dtype: string
- name: category
dtype: string
- name: ingress
dtype: string
- name: article
dtype: string
- name: abstract
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 841217948
num_examples: 614918
download_size: 211286623
dataset_size: 841217948
---
# Dataset Card for "nowiki_abstract_second_scrape_20230201"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
stjokerli/TextToText_mnli | ---
license: mit
---
|
open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B | ---
pretty_name: Evaluation run of garage-bAInd/Camel-Platypus2-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [garage-bAInd/Camel-Platypus2-70B](https://huggingface.co/garage-bAInd/Camel-Platypus2-70B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T06:37:05.018958](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B/blob/main/results_2023-10-16T06-37-05.018958.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5069211409395973,\n\
\ \"em_stderr\": 0.0051199774044148345,\n \"f1\": 0.559724203020135,\n\
\ \"f1_stderr\": 0.004829732229468497,\n \"acc\": 0.5345469918434537,\n\
\ \"acc_stderr\": 0.01116294273345166\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5069211409395973,\n \"em_stderr\": 0.0051199774044148345,\n\
\ \"f1\": 0.559724203020135,\n \"f1_stderr\": 0.004829732229468497\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2395754359363154,\n \
\ \"acc_stderr\": 0.01175686434407741\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825909\n\
\ }\n}\n```"
repo_url: https://huggingface.co/garage-bAInd/Camel-Platypus2-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|drop|3_2023-09-23T09-15-03.498663.parquet'
- split: 2023_10_16T06_37_05.018958
path:
- '**/details_harness|drop|3_2023-10-16T06-37-05.018958.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T06-37-05.018958.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|gsm8k|5_2023-09-23T09-15-03.498663.parquet'
- split: 2023_10_16T06_37_05.018958
path:
- '**/details_harness|gsm8k|5_2023-10-16T06-37-05.018958.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T06-37-05.018958.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|winogrande|5_2023-09-23T09-15-03.498663.parquet'
- split: 2023_10_16T06_37_05.018958
path:
- '**/details_harness|winogrande|5_2023-10-16T06-37-05.018958.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T06-37-05.018958.parquet'
- config_name: results
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- results_2023-08-18T00:04:49.359575.parquet
- split: 2023_09_23T09_15_03.498663
path:
- results_2023-09-23T09-15-03.498663.parquet
- split: 2023_10_16T06_37_05.018958
path:
- results_2023-10-16T06-37-05.018958.parquet
- split: latest
path:
- results_2023-10-16T06-37-05.018958.parquet
---
# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Camel-Platypus2-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [garage-bAInd/Camel-Platypus2-70B](https://huggingface.co/garage-bAInd/Camel-Platypus2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
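As an illustration of the split naming above, a run timestamp maps to its split name by replacing the `-` and `:` characters with underscores. This is a small helper sketch inferred from the names in this card, not part of the official tooling:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp like '2023-10-16T06:37:05.018958'
    to the split name used in this card, e.g. '2023_10_16T06_37_05.018958'."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-16T06:37:05.018958"))
# 2023_10_16T06_37_05.018958
```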
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T06:37:05.018958](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B/blob/main/results_2023-10-16T06-37-05.018958.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.5069211409395973,
"em_stderr": 0.0051199774044148345,
"f1": 0.559724203020135,
"f1_stderr": 0.004829732229468497,
"acc": 0.5345469918434537,
"acc_stderr": 0.01116294273345166
},
"harness|drop|3": {
"em": 0.5069211409395973,
"em_stderr": 0.0051199774044148345,
"f1": 0.559724203020135,
"f1_stderr": 0.004829732229468497
},
"harness|gsm8k|5": {
"acc": 0.2395754359363154,
"acc_stderr": 0.01175686434407741
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825909
}
}
```
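As a sanity check on the figures above, the aggregated `acc` in the `"all"` block appears to be the plain unweighted mean of the two per-task accuracies. This is an observation about the numbers in this card, not a statement about the harness internals:

```python
# Per-task accuracies from the latest results above
gsm8k_acc = 0.2395754359363154
winogrande_acc = 0.829518547750592

# The "all" block's acc matches their unweighted mean
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # approximately 0.5345469918434537
```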
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jilp00/water-diplomacy-transcripts | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 316351
num_examples: 223
download_size: 173329
dataset_size: 316351
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/nanakusa_nichika_theidolmstershinycolors | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nanakusa_nichika/七草にちか (THE iDOLM@STER: SHINY COLORS)
This is the dataset of nanakusa_nichika/七草にちか (THE iDOLM@STER: SHINY COLORS), containing 365 images and their tags.
The core tags of this character are `green_hair, short_hair, green_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 365 | 578.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanakusa_nichika_theidolmstershinycolors/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 365 | 287.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanakusa_nichika_theidolmstershinycolors/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 870 | 631.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanakusa_nichika_theidolmstershinycolors/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 365 | 491.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanakusa_nichika_theidolmstershinycolors/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 870 | 994.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanakusa_nichika_theidolmstershinycolors/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nanakusa_nichika_theidolmstershinycolors',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_shirt, blush, hairclip, long_sleeves, sweater, bag, cardigan, holding, simple_background, smile, white_background, white_skirt |
| 1 | 35 |  |  |  |  |  | looking_at_viewer, 1girl, earrings, long_sleeves, frills, solo, nail_polish, smile, neck_ribbon, simple_background, upper_body, blush, jacket, very_long_hair, white_background, red_nails, white_shirt, black_ribbon, belt, open_mouth |
| 2 | 9 |  |  |  |  |  | 1girl, bowtie, looking_at_viewer, school_uniform, simple_background, solo, sweater_vest, upper_body, blush, short_sleeves, white_background, white_shirt, striped_bow |
| 3 | 14 |  |  |  |  |  | 1girl, bowtie, plaid_skirt, school_uniform, short_sleeves, sweater_vest, looking_at_viewer, pleated_skirt, solo, simple_background, white_shirt, blush, white_background, smile, socks, full_body |
| 4 | 12 |  |  |  |  |  | frills, maid_headdress, solo, 1girl, looking_at_viewer, puffy_short_sleeves, wrist_cuffs, blush, smile, hair_ribbon, short_twintails, simple_background, black_bow, bowtie, enmaided, maid_apron, white_background, upper_body |
| 5 | 11 |  |  |  |  |  | 1girl, hair_ornament, smile, solo, white_gloves, idol, looking_at_viewer, blush, dress, open_mouth, blue_skirt, detached_collar, simple_background, sweat, white_background |
| 6 | 5 |  |  |  |  |  | 1girl, earrings, green_jacket, looking_at_viewer, solo, simple_background, sleeveless, white_background, bare_shoulders, midriff, navel, off_shoulder, ponytail, smile, upper_body, black_gloves, blush, closed_mouth, crop_top, fishnets, medium_breasts, necklace, open_jacket, very_long_hair |
| 7 | 9 |  |  |  |  |  | 1girl, simple_background, white_background, blush, looking_at_viewer, navel, solo, collarbone, medium_breasts, panties |
| 8 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, outdoors, solo, blush, blue_sky, day, medium_breasts, collarbone, eyewear_on_head, hairclip, heart-shaped_eyewear, navel, ocean, sunglasses, visor_cap, beach, bracelet, cleavage, polka_dot_bikini, twintails, brown_eyes, brown_hair, cloud, smile |
| 9 | 7 |  |  |  |  |  | 1girl, blush, hetero, penis, solo_focus, 1boy, nipples, cum_in_pussy, mosaic_censoring, open_mouth, sex, on_back, spread_legs, vaginal, medium_breasts, missionary, navel, nude, pubic_hair, small_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_shirt | blush | hairclip | long_sleeves | sweater | bag | cardigan | holding | simple_background | smile | white_background | white_skirt | earrings | frills | nail_polish | neck_ribbon | upper_body | jacket | very_long_hair | red_nails | black_ribbon | belt | open_mouth | bowtie | school_uniform | sweater_vest | short_sleeves | striped_bow | plaid_skirt | pleated_skirt | socks | full_body | maid_headdress | puffy_short_sleeves | wrist_cuffs | hair_ribbon | short_twintails | black_bow | enmaided | maid_apron | hair_ornament | white_gloves | idol | dress | blue_skirt | detached_collar | sweat | green_jacket | sleeveless | bare_shoulders | midriff | navel | off_shoulder | ponytail | black_gloves | closed_mouth | crop_top | fishnets | medium_breasts | necklace | open_jacket | collarbone | panties | outdoors | blue_sky | day | eyewear_on_head | heart-shaped_eyewear | ocean | sunglasses | visor_cap | beach | bracelet | cleavage | polka_dot_bikini | twintails | brown_eyes | brown_hair | cloud | hetero | penis | solo_focus | 1boy | nipples | cum_in_pussy | mosaic_censoring | sex | on_back | spread_legs | vaginal | missionary | nude | pubic_hair | small_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:--------|:-----------|:---------------|:----------|:------|:-----------|:----------|:--------------------|:--------|:-------------------|:--------------|:-----------|:---------|:--------------|:--------------|:-------------|:---------|:-----------------|:------------|:---------------|:-------|:-------------|:---------|:-----------------|:---------------|:----------------|:--------------|:--------------|:----------------|:--------|:------------|:-----------------|:----------------------|:--------------|:--------------|:------------------|:------------|:-----------|:-------------|:----------------|:---------------|:-------|:--------|:-------------|:------------------|:--------|:---------------|:-------------|:-----------------|:----------|:--------|:---------------|:-----------|:---------------|:---------------|:-----------|:-----------|:-----------------|:-----------|:--------------|:-------------|:----------|:-----------|:-----------|:------|:------------------|:-----------------------|:--------|:-------------|:------------|:--------|:-----------|:-----------|:-------------------|:------------|:-------------|:-------------|:--------|:---------|:--------|:-------------|:-------|:----------|:---------------|:-------------------|:------|:----------|:--------------|:----------|:-------------|:-------|:-------------|:----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 35 |  |  |  |  |  | X | X | X | X | X | | X | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | | | | | | | X | | X | | | | | | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | X | X | | | | | | | X | X | X | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | X | X | | X | | | | | | | X | X | X | | | X | | | X | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | X | X | | X | | | | | | | X | X | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | | X | | | | | | | X | X | X | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | X | X | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 18 |  |  |  |  |  | X | X | X | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/himari_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of himari/明星ヒマリ/日鞠 (Blue Archive)
This is the dataset of himari/明星ヒマリ/日鞠 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `mole, long_hair, pointy_ears, mole_under_eye, halo, hairband, purple_eyes, hair_ornament, black_hairband, hair_flower, grey_hair, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/himari_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 845.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himari_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1308 | 1.66 GiB | [Download](https://huggingface.co/datasets/CyberHarem/himari_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/himari_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_gloves, hair_tubes, long_sleeves, looking_at_viewer, sitting, solo, smile, white_jacket, blush, simple_background, white_background, closed_mouth, wheelchair, white_flower |
| 1 | 10 |  |  |  |  |  | 1girl, blush, simple_background, smile, upper_body, white_background, white_flower, hair_tubes, looking_at_viewer, solo, white_jacket, open_mouth, long_sleeves, black_gloves |
| 2 | 31 |  |  |  |  |  | 1girl, barefoot, hair_tubes, solo, toes, long_sleeves, looking_at_viewer, black_gloves, blush, white_jacket, simple_background, sitting, full_body, toenails, black_leggings, legs, soles, white_background, closed_mouth, foot_focus, white_flower, smile, striped_hairband, pants |
| 3 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, hair_tubes, looking_at_viewer, small_breasts, solo, alternate_costume, collarbone, navel, simple_background, sitting, smile, stomach, flower, string_bikini, bare_arms, black_bikini, closed_mouth, side-tie_bikini_bottom, striped_hairband, wheelchair, white_bikini |
| 4 | 15 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, flower, hair_tubes, nipples, open_mouth, penis, sex, small_breasts, completely_nude, cum, sweat, vaginal, bar_censor, navel, on_back |
| 5 | 6 |  |  |  |  |  | 1girl, alternate_costume, fake_animal_ears, playboy_bunny, rabbit_ears, small_breasts, solo, strapless_leotard, detached_collar, looking_at_viewer, bare_shoulders, black_bowtie, black_pantyhose, blush, hair_tubes, open_mouth, rabbit_tail, smile, white_leotard, wrist_cuffs, ass, black_gloves, fake_tail, simple_background, white_background, white_flower |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | hair_tubes | long_sleeves | looking_at_viewer | sitting | solo | smile | white_jacket | blush | simple_background | white_background | closed_mouth | wheelchair | white_flower | upper_body | open_mouth | barefoot | toes | full_body | toenails | black_leggings | legs | soles | foot_focus | striped_hairband | pants | bare_shoulders | small_breasts | alternate_costume | collarbone | navel | stomach | flower | string_bikini | bare_arms | black_bikini | side-tie_bikini_bottom | white_bikini | 1boy | hetero | solo_focus | nipples | penis | sex | completely_nude | cum | sweat | vaginal | bar_censor | on_back | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | detached_collar | black_bowtie | black_pantyhose | rabbit_tail | white_leotard | wrist_cuffs | ass | fake_tail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------------|:---------------|:--------------------|:----------|:-------|:--------|:---------------|:--------|:--------------------|:-------------------|:---------------|:-------------|:---------------|:-------------|:-------------|:-----------|:-------|:------------|:-----------|:-----------------|:-------|:--------|:-------------|:-------------------|:--------|:-----------------|:----------------|:--------------------|:-------------|:--------|:----------|:---------|:----------------|:------------|:---------------|:-------------------------|:---------------|:-------|:---------|:-------------|:----------|:--------|:------|:------------------|:------|:--------|:----------|:-------------|:----------|:-------------------|:----------------|:--------------|:--------------------|:------------------|:---------------|:------------------|:--------------|:----------------|:--------------|:------|:------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 31 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | X | X | X | X | | X | X | | X | X | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | | X | | | | | | | X | | | | | | | X | | | | | | | | | | | | X | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | | X | | X | X | | X | X | X | | | X | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
AI-B/CHI | ---
license: unlicense
tags:
- UNA
- DPO
- ULMA
pretty_name: CHI
---
This repository outlines the methodology for creating training sets aimed at aligning a language model with a specific character and persona.
The process involves utilizing a Direct Preference Optimization (DPO) dataset to steer the model towards embodying the defined character and persona traits.
Following this, a Unified Neutral Alignment (UNA) dataset is employed to moderate any excessive sentiments resulting from the DPO training.
The final step involves merging the model realigned with the UNA dataset into the original DPO-trained model, forming a Unified Language Model Alignment (ULMA).
### DPO Training Set (Target Character and Persona)
1. **Define Character and Persona**:
Precisely define the traits, behaviors, and speech patterns of the intended character and persona, including language style, tone, typical responses, and unique characteristics.
2. **Dataset Construction**:
Develop a dataset that reflects these characteristics through dialogues, monologues, and interactions typical of the persona. Ensure the dataset's diversity to encompass various scenarios and responses.
3. **Annotation**:
Label each dataset instance with preference scores or binary labels, indicating its alignment with the target persona for effective DPO implementation.
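The annotated DPO instances can be sketched as preference pairs. This is a hypothetical example, assuming the common `prompt`/`chosen`/`rejected` convention (as used, for instance, by TRL's `DPOTrainer`); the card does not mandate a specific schema.

```python
# Hypothetical sketch of a single DPO training record for persona alignment.
# Field names ("prompt", "chosen", "rejected") follow a common DPO convention,
# not a format required by this repository.

def make_dpo_record(prompt: str, in_persona: str, off_persona: str) -> dict:
    """Pair an in-persona reply (preferred) with an off-persona one (rejected)."""
    return {
        "prompt": prompt,
        "chosen": in_persona,    # reply matching the target character's voice
        "rejected": off_persona, # generic or out-of-character reply
    }

record = make_dpo_record(
    "How do I brew tea?",
    "Ah, a splendid question! One must first warm the pot, dear friend.",
    "Boil water, add a tea bag, wait 3 minutes.",
)
print(record["chosen"])
```

During DPO training, the optimizer pushes the model's likelihood of `chosen` above that of `rejected` for the same prompt, steering generations toward the persona's voice.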
### UNA Training Set (Neutralizing Extremes)
1. **Identify Extremes**:
Identify extreme positive or negative sentiments in the context of your character, such as overly aggressive or excessively submissive language.
2. **Neutral Dataset**:
Build a dataset representing neutral interactions and responses, focusing on language and replies that are balanced and free from identified extremes.
3. **Annotation for Neutrality**:
Annotate the dataset to promote a neutral, balanced language style, possibly employing a point-wise preference approach similar to DPO.
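A point-wise neutrality annotation could look like the following toy sketch, which is purely illustrative: it flags responses containing hand-picked "extreme" marker words, whereas a real pipeline would use a sentiment or toxicity model.

```python
# Toy point-wise neutrality annotator for the UNA set (illustrative only).
# Responses containing words from a hand-picked "extreme" list score 0.0;
# everything else scores 1.0. The marker list here is a placeholder.
EXTREME_MARKERS = {"hate", "worthless", "adore", "worship"}

def neutrality_score(response: str) -> float:
    """Return 1.0 for neutral responses, 0.0 for ones with extreme sentiment."""
    words = {w.strip(".,!?").lower() for w in response.split()}
    return 0.0 if words & EXTREME_MARKERS else 1.0

print(neutrality_score("I absolutely worship this idea!"))  # → 0.0
```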
### Training and Merging Models
1. **Train Separate Models**:
Train one model using the DPO dataset and subsequently realign it using the UNA dataset. Each model will learn distinct aspects: character alignment and neutralization of extremes.
2. **Merging Models**:
   Combining two independently trained models into a single unified model is complex and typically requires sophisticated merging techniques and a solid understanding of the model architecture. For this, we employ `LazyMergeKit`.
3. **Evaluation and Adjustment**:
Post-merging, assess the unified model to verify if it achieves the intended balance. Iterative refinement of the training datasets and merging process might be necessary based on evaluation outcomes. |
SerahAKojenu/Assignment1 | ---
dataset_info:
features:
- name: longitude
dtype: float64
- name: latitude
dtype: float64
- name: housing_median_age
dtype: float64
- name: total_rooms
dtype: float64
- name: total_bedrooms
dtype: float64
- name: population
dtype: float64
- name: households
dtype: float64
- name: median_income
dtype: float64
- name: median_house_value
dtype: float64
- name: ocean_proximity
dtype: string
splits:
- name: train
num_bytes: 1737680
num_examples: 20640
download_size: 0
dataset_size: 1737680
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Assignment1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_train6000_eval6489_v1_recite_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_ic_qa
path: data/train_ic_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_ic_qa
path: data/eval_ic_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 697367
num_examples: 6000
- name: train_ic_qa
num_bytes: 4540536
num_examples: 6000
- name: train_recite_qa
num_bytes: 4546536
num_examples: 6000
- name: eval_qa
num_bytes: 752802
num_examples: 6489
- name: eval_ic_qa
num_bytes: 4906186
num_examples: 6489
- name: eval_recite_qa
num_bytes: 4912675
num_examples: 6489
- name: all_docs
num_bytes: 7126313
num_examples: 10925
- name: all_docs_eval
num_bytes: 7125701
num_examples: 10925
- name: train
num_bytes: 11672849
num_examples: 16925
- name: validation
num_bytes: 4912675
num_examples: 6489
download_size: 31822578
dataset_size: 51193640
---
# Dataset Card for "lmind_nq_train6000_eval6489_v1_recite_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akanametov/minions-dataset | ---
license: mit
---
|
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_T_D_PNP_GENERIC_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 12334750
num_examples: 1000
download_size: 2117039
dataset_size: 12334750
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_T_D_PNP_GENERIC_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
talentlabs/training-data-blog-writer_v03-09-2023 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 48639092
num_examples: 9504
download_size: 30032406
dataset_size: 48639092
---
# Dataset Card for "training-data-blog-writer_v03-09-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MMInstruction/ArxivQA | ---
license: cc-by-sa-4.0
task_categories:
- image-to-text
language:
- en
tags:
- 'vision-language '
- vqa
pretty_name: ArxivQA
size_categories:
- 10K<n<100K
---
# Dataset Card for Multimodal Arxiv QA
## Dataset Loading Instruction
Each line of the `arxivqa.jsonl` file is an example:
```
{"id": "cond-mat-2862",
"image": "images/0805.4509_1.jpg",
"options": ["A) The ordering temperatures for all materials are above the normalized temperature \\( T/T_c \\) of 1.2.", "B) The magnetic ordering temperatures decrease for Dy, Tb, and Ho as the normalized temperature \\( T/T_c \\) approaches 1.", "C) The magnetic ordering temperatures for all materials are the same across the normalized temperature \\( T/T_c \\).", "D) The magnetic ordering temperature is highest for Yttrium (Y) and decreases for Dy, Tb, and Ho."],
"question": "What can be inferred about the magnetic ordering temperatures of the materials tested as shown in the graph?",
"label": "B",
"rationale": "The graph shows a sharp decline in frequency as the normalized temperature \\( T/T_c \\) approaches 1 for Dy, Tb, and Ho, indicating that their magnetic ordering temperatures decrease. No such data is shown for Yttrium (Y), thus we can't infer it has the highest magnetic ordering temperature."
}
```
- Download `arxivqa.jsonl` and `images.tgz` to your machine.
- Decompress images: `tar -xzvf images.tgz`.
- Load the dataset and process the samples according to your needs.
```python3
import json
with open("arxivqa.jsonl", 'r') as fr:
arxiv_qa = [ json.loads(line.strip()) for line in fr]
sample = arxiv_qa[0]
print(sample["image"]) # image file
```
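Given the sample schema shown above, the letter label can be mapped back to its option text. This is a small illustrative helper, assuming options are consistently prefixed `A)`, `B)`, etc., as in the example record.

```python
# Sketch: map a sample's letter label (e.g. "B") back to its option text.
# Assumes options are formatted "A) ...", "B) ..." as in the example above.

def answer_text(sample: dict) -> str:
    """Return the full option string corresponding to the sample's label."""
    idx = ord(sample["label"]) - ord("A")
    return sample["options"][idx]

sample = {
    "options": ["A) first", "B) second", "C) third", "D) fourth"],
    "label": "B",
}
print(answer_text(sample))  # → B) second
```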
## Dataset details
**Dataset type**: ArxivQA is a set of GPT4V-generated VQA samples based on figures from Arxiv Papers.
**Papers or resources for more information**: https://mm-arxiv.github.io/
**License**: CC-BY-SA-4.0; use should also abide by the OpenAI terms of use:
https://openai.com/policies/terms-of-use
**Intended use**:
Primary intended uses: The primary use of ArxivQA is research on large multimodal models.
Primary intended users: The primary intended users of the dataset are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence. |
open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo | ---
pretty_name: Evaluation run of macadeliccc/laser-dolphin-mixtral-2x7b-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/laser-dolphin-mixtral-2x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T01:13:57.359475](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo/blob/main/results_2024-01-14T01-13-57.359475.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6323249282667325,\n\
\ \"acc_stderr\": 0.03235123186693868,\n \"acc_norm\": 0.63602882598941,\n\
\ \"acc_norm_stderr\": 0.03299471578731984,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.01738476747898622,\n \"mc2\": 0.6075861082832835,\n\
\ \"mc2_stderr\": 0.015099206529299735\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n\
\ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892978\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6661023700458076,\n\
\ \"acc_stderr\": 0.004706398252382464,\n \"acc_norm\": 0.8579964150567616,\n\
\ \"acc_norm_stderr\": 0.0034834044902359936\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440679,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440679\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478923,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478923\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848033,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848033\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n\
\ \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n\
\ \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n\
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899129,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899129\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865467,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865467\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399682,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399682\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824876,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824876\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825362,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825362\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.01738476747898622,\n \"mc2\": 0.6075861082832835,\n\
\ \"mc2_stderr\": 0.015099206529299735\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4829416224412434,\n \
\ \"acc_stderr\": 0.013764467123761318\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-13-57.359475.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- '**/details_harness|winogrande|5_2024-01-14T01-13-57.359475.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T01-13-57.359475.parquet'
- config_name: results
data_files:
- split: 2024_01_14T01_13_57.359475
path:
- results_2024-01-14T01-13-57.359475.parquet
- split: latest
path:
- results_2024-01-14T01-13-57.359475.parquet
---
# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-2x7b-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-2x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo",
"harness_winogrande_5",
split="train")
```
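As a minimal sketch of how the per-task entries relate to the aggregated metrics (assuming the "all" figures are macro-averages over the per-task scores, which is how the leaderboard combines them), the task-level accuracies can be aggregated like so:

```python
# Sketch: macro-average the per-task "acc" values from a results payload.
# The dict below is a small hypothetical subset of the full results JSON.
results = {
    "harness|arc:challenge|25": {"acc": 0.6245733788395904},
    "harness|hellaswag|10": {"acc": 0.6661023700458076},
}

# Each task contributes equally, regardless of how many examples it has.
task_accs = [task["acc"] for task in results.values()]
macro_avg = sum(task_accs) / len(task_accs)
print(round(macro_avg, 4))  # → 0.6453
```

The full results JSON (shown below under "Latest results") follows the same shape, with one `harness|...` key per task.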
## Latest results
These are the [latest results from run 2024-01-14T01:13:57.359475](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo/blob/main/results_2024-01-14T01-13-57.359475.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6323249282667325,
"acc_stderr": 0.03235123186693868,
"acc_norm": 0.63602882598941,
"acc_norm_stderr": 0.03299471578731984,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.01738476747898622,
"mc2": 0.6075861082832835,
"mc2_stderr": 0.015099206529299735
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6661023700458076,
"acc_stderr": 0.004706398252382464,
"acc_norm": 0.8579964150567616,
"acc_norm_stderr": 0.0034834044902359936
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440679,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440679
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478923,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848033,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899129,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899129
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865467,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865467
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399682,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824876,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824876
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825362,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825362
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.01738476747898622,
"mc2": 0.6075861082832835,
"mc2_stderr": 0.015099206529299735
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.4829416224412434,
"acc_stderr": 0.013764467123761318
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Yamei/TVCG_entity_classify | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: int64
- name: entity_type
dtype: string
- name: entity_type_high
dtype: string
- name: label
dtype:
class_label:
names:
'0': method
'1': evaluation
'2': data
'3': background
'4': Author
'5': DBPedia
'6': Affiliation
'7': Paper
'8': Journal
splits:
- name: train
num_bytes: 3565888.348566884
num_examples: 47281
- name: test
num_bytes: 891528.6514331156
num_examples: 11821
download_size: 1857856
dataset_size: 4457417.0
---
# Dataset Card for "TVCG_entity_classify"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tellarin-ai/ntx_llm_inst_hindi | ---
license: cc-by-sa-4.0
language:
- hi
task_categories:
- token-classification
---
# Dataset Card for NTX v1 in the Aya format - Hindi subset
This dataset is a format conversion of the Hindi data from the original NTX into the Aya instruction format, and it is released here under the CC-BY-SA 4.0 license.
## Dataset Details
For the original NTX dataset, the conversion to the Aya instructions format, or more details, please refer to the full dataset in instruction form (https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions) or to the paper below.
**NOTE:** Unfortunately, due to a conversion issue with numerical expressions, this version only includes the temporal expressions part of NTX.
## Citation
If you utilize this dataset version, feel free to cite/footnote the complete version at https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions, but please also cite the *original dataset publication*.
**BibTeX:**
```
@misc{chen2023dataset,
title={Dataset and Baseline System for Multi-lingual Extraction and Normalization of Temporal and Numerical Expressions},
author={Sanxing Chen and Yongqiang Chen and Börje F. Karlsson},
year={2023},
eprint={2303.18103},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
MBZUAI/LaMini-Hallucination | ---
dataset_info:
features:
- name: question
dtype: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 2785
num_examples: 40
download_size: 3220
dataset_size: 2785
---
# Dataset Card for "LaMini-Hallucination"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Citation
```
@article{lamini-lm,
author = {Minghao Wu and
Abdul Waheed and
Chiyu Zhang and
Muhammad Abdul-Mageed and
Alham Fikri Aji
},
title = {LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions},
journal = {CoRR},
volume = {abs/2304.14402},
year = {2023},
url = {https://arxiv.org/abs/2304.14402},
eprinttype = {arXiv},
eprint = {2304.14402}
}
``` |
yzhuang/metatree_vehicle_sensIT | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float32
- name: y
dtype: int64
splits:
- name: train
num_bytes: 28973280
num_examples: 68984
- name: validation
num_bytes: 12408480
num_examples: 29544
download_size: 60104700
dataset_size: 41381760
---
# Dataset Card for "metatree_vehicle_sensIT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
richardr1126/spider-context-validation | ---
language:
- en
license:
- cc-by-4.0
source_datasets:
- spider
pretty_name: Spider Context Validation
tags:
- text-to-sql
- SQL
- spider
- validation
- eval
- spider-eval
dataset_info:
features:
- name: db_id
dtype: string
- name: question
dtype: string
- name: db_info
dtype: string
- name: ground_truth
dtype: string
---
# Dataset Card for Spider Context Validation
### Dataset Summary
Spider is a large-scale, complex, and cross-domain semantic parsing and text-to-SQL dataset annotated by 11 Yale students.
The goal of the Spider challenge is to develop natural language interfaces to cross-domain databases.
This dataset was created to validate spider-fine-tuned LLMs with database context.
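Each validation row pairs a natural-language question with the serialized database context for its `db_id`. As a minimal sketch of how such a row might be turned into a prompt for a spider-fine-tuned LLM (the prompt template and the example row contents below are illustrative assumptions, not the exact format used to build this dataset; only the field names `db_id`, `question`, `db_info`, and `ground_truth` come from the schema above):

```python
# Illustrative sketch: assembling a text-to-SQL prompt from one validation row.
# Field names match this dataset's schema; the template and row contents are
# made-up placeholders.

def build_prompt(row: dict) -> str:
    """Combine the database context and the question into a single prompt."""
    return (
        f"-- Database: {row['db_id']}\n"
        f"{row['db_info']}\n"
        f"-- Question: {row['question']}\n"
        f"-- SQL:"
    )

example_row = {
    "db_id": "concert_singer",
    "question": "How many singers do we have?",
    "db_info": "CREATE TABLE singer (singer_id INT, name TEXT, age INT);",
    "ground_truth": "SELECT count(*) FROM singer",
}

prompt = build_prompt(example_row)
print(prompt)
```

The model's completion can then be compared against the `ground_truth` SQL, e.g. with the spider-eval tooling mentioned in the tags.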
### Yale Lily Spider Leaderboards
The leaderboard can be seen at https://yale-lily.github.io/spider
### Languages
The text in the dataset is in English.
### Licensing Information
The spider dataset is licensed under
the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/legalcode)
### Citation
```
@article{yu2018spider,
title={Spider: A large-scale human-labeled dataset for complex and cross-domain semantic parsing and text-to-sql task},
author={Yu, Tao and Zhang, Rui and Yang, Kai and Yasunaga, Michihiro and Wang, Dongxu and Li, Zifan and Ma, James and Li, Irene and Yao, Qingning and Roman, Shanelle and others},
journal={arXiv preprint arXiv:1809.08887},
year={2018}
}
``` |
AUTOMATIC/jaicards | ---
license: mit
task_categories:
- conversational
- text-generation
size_categories:
- 100K<n<1M
---
# janitorai-cards
This dataset contains 190k cards that I received from janitorai, from a source that wished to remain anonymous.
My addition to this data is the conversion of the cards to the [v2 character card](https://github.com/malfoyslastname/character-card-spec-v2/blob/main/README.md) format, and a local webpage that can be used to explore the dataset.
### Webpage

The webpage lets you browse cards, search by text, filter by tags, and order by date/name/popularity.
To use the webpage, put [index.html](index.html) into a directory, then download and extract the archives into the same directory: [0123.zip](0123.zip), [4567.zip](4567.zip), [89ab.zip](89ab.zip), [cdef.zip](cdef.zip), and [html.zip](html.zip).
After that, just open [index.html](index.html) in the browser.
The directory structure should look like this:
```
📁
┣━━ 📄 index.html
┣━━ 📁 cards
┃ ┣━━ 📁 0
┃ ┣━━ 📁 1
┃ ┃ ...
┃ ┗━━ 📁 f
┗━━ 📁 html
┣━━ 📄 allcards.js
┣━━ 📄 cards.js
┗━━ 📄 cardsmeta.js
```
For performance reasons, the webpage only loads the 10,000 most popular cards when you open it. To view all, click the "Load all" button in the top row.
Caveat: instead of downloading the card, it opens it in a new page—you have to save it yourself. I can't figure out how to get the download to work.
### Files
- [0123.zip](0123.zip), [4567.zip](4567.zip), [89ab.zip](89ab.zip), [cdef.zip](cdef.zip) - archives with v2 character cards, tested to work with SillyTavern.
- [cards-js.7z](cards-js.7z) - all v2 character cards in json format, without images, tested to work with SillyTavern.
- [index.html](index.html) - webpage for browsing cards.
- [html.zip](html.zip) - files with information about cards - it's needed for the webpage to function.
- [orig.7z](orig.7z) - original json files with cards from janitorai - not compatible with any software.
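A card from the cards-js archive can be inspected with a few lines of Python. The shell below follows the v2 character card spec linked above (`spec`, `spec_version`, and a `data` sub-object); the character fields shown are a made-up placeholder, and real cards in the archive carry many more fields:

```python
import json

# Minimal v2 character card shell, per the spec linked above.
# The contents of "data" here are placeholders, not a real card.
card = {
    "spec": "chara_card_v2",
    "spec_version": "2.0",
    "data": {
        "name": "Example Bot",
        "description": "A placeholder character.",
        "first_mes": "Hello!",
    },
}

# Round-trip through JSON, as if reading one file from the archive.
loaded = json.loads(json.dumps(card))
assert loaded["spec"] == "chara_card_v2"
print(loaded["data"]["name"], "-", loaded["data"]["description"])
```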
|
Maciel/e-commerce-sample-images | ---
license: apache-2.0
---
|
camilaslz/LEOFELIPE | ---
license: openrail
---
|
DataLinguistic/MutiDataset | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-project-jnlpba-c103d433-1295449602 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jnlpba
eval_info:
task: entity_extraction
model: siddharthtumre/biobert-ner
metrics: []
dataset_name: jnlpba
dataset_config: jnlpba
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: siddharthtumre/biobert-ner
* Dataset: jnlpba
* Config: jnlpba
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@siddharthtumre](https://huggingface.co/siddharthtumre) for evaluating this model. |
heliosprime/twitter_dataset_1713189721 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 15533
num_examples: 42
download_size: 16373
dataset_size: 15533
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713189721"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_existential_got | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 183289
num_examples: 843
- name: dev_mismatched
num_bytes: 163475
num_examples: 689
- name: test_matched
num_bytes: 182488
num_examples: 842
- name: test_mismatched
num_bytes: 155896
num_examples: 712
- name: train
num_bytes: 7382865
num_examples: 33251
download_size: 5028376
dataset_size: 8068013
---
# Dataset Card for "MULTI_VALUE_mnli_existential_got"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chlee10__T3Q-Platypus-SOLAR | ---
pretty_name: Evaluation run of chlee10/T3Q-Platypus-SOLAR
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chlee10/T3Q-Platypus-SOLAR](https://huggingface.co/chlee10/T3Q-Platypus-SOLAR)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chlee10__T3Q-Platypus-SOLAR\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-12T05:48:31.143734](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-Platypus-SOLAR/blob/main/results_2024-03-12T05-48-31.143734.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5426000137755855,\n\
\ \"acc_stderr\": 0.03377258305011606,\n \"acc_norm\": 0.543700672611483,\n\
\ \"acc_norm_stderr\": 0.034480897315487986,\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5066765475632212,\n\
\ \"mc2_stderr\": 0.01508715500679457\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.6186006825938567,\n \"acc_norm_stderr\": 0.014194389086685247\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6440948018323043,\n\
\ \"acc_stderr\": 0.004778081784542405,\n \"acc_norm\": 0.8417645887273452,\n\
\ \"acc_norm_stderr\": 0.0036421571661623495\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.030533338430467523,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.030533338430467523\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808779,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808779\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392869,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392869\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6290322580645161,\n \"acc_stderr\": 0.02748054188795359,\n \"\
acc_norm\": 0.6290322580645161,\n \"acc_norm_stderr\": 0.02748054188795359\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"\
acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909895,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909895\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n\
\ \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.03247390276569669,\n\
\ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.03247390276569669\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6972477064220184,\n \"acc_stderr\": 0.019698711434756343,\n \"\
acc_norm\": 0.6972477064220184,\n \"acc_norm_stderr\": 0.019698711434756343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656629,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656629\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083291,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083291\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159287,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159287\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.038956324641389366,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.038956324641389366\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.015133383278988836,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.015133383278988836\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357629,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357629\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409153,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409153\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.02769033753648537,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.02769033753648537\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3956975228161669,\n\
\ \"acc_stderr\": 0.012489290735449018,\n \"acc_norm\": 0.3956975228161669,\n\
\ \"acc_norm_stderr\": 0.012489290735449018\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5588235294117647,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893783,\n\
\ \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.03345563070339193,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.03345563070339193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.032467217651178264,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.032467217651178264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5066765475632212,\n\
\ \"mc2_stderr\": 0.01508715500679457\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43745261561789234,\n \
\ \"acc_stderr\": 0.013664299060751915\n }\n}\n```"
repo_url: https://huggingface.co/chlee10/T3Q-Platypus-SOLAR
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|arc:challenge|25_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|gsm8k|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hellaswag|10_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T05-48-31.143734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T05-48-31.143734.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- '**/details_harness|winogrande|5_2024-03-12T05-48-31.143734.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-12T05-48-31.143734.parquet'
- config_name: results
data_files:
- split: 2024_03_12T05_48_31.143734
path:
- results_2024-03-12T05-48-31.143734.parquet
- split: latest
path:
- results_2024-03-12T05-48-31.143734.parquet
---
# Dataset Card for Evaluation run of chlee10/T3Q-Platypus-SOLAR
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chlee10/T3Q-Platypus-SOLAR](https://huggingface.co/chlee10/T3Q-Platypus-SOLAR) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chlee10__T3Q-Platypus-SOLAR",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-12T05:48:31.143734](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-Platypus-SOLAR/blob/main/results_2024-03-12T05-48-31.143734.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5426000137755855,
"acc_stderr": 0.03377258305011606,
"acc_norm": 0.543700672611483,
"acc_norm_stderr": 0.034480897315487986,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5066765475632212,
"mc2_stderr": 0.01508715500679457
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.6186006825938567,
"acc_norm_stderr": 0.014194389086685247
},
"harness|hellaswag|10": {
"acc": 0.6440948018323043,
"acc_stderr": 0.004778081784542405,
"acc_norm": 0.8417645887273452,
"acc_norm_stderr": 0.0036421571661623495
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.030533338430467523,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.030533338430467523
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808779,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808779
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392869,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392869
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.02748054188795359,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.02748054188795359
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909895,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909895
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47692307692307695,
"acc_stderr": 0.025323990861736118,
"acc_norm": 0.47692307692307695,
"acc_norm_stderr": 0.025323990861736118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6972477064220184,
"acc_stderr": 0.019698711434756343,
"acc_norm": 0.6972477064220184,
"acc_norm_stderr": 0.019698711434756343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656629,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656629
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159287,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159287
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.038956324641389366,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.038956324641389366
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988836,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988836
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357629,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357629
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409153,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409153
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.02769033753648537,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.02769033753648537
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3956975228161669,
"acc_stderr": 0.012489290735449018,
"acc_norm": 0.3956975228161669,
"acc_norm_stderr": 0.012489290735449018
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893783,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.03345563070339193,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.03345563070339193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5066765475632212,
"mc2_stderr": 0.01508715500679457
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.43745261561789234,
"acc_stderr": 0.013664299060751915
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mattymchen/refinedweb-3m | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7834920949
num_examples: 3000000
download_size: 4904877808
dataset_size: 7834920949
---
# Dataset Card for "refinedweb-3m"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
QizhiPei/BioT5_finetune_dataset | ---
license: mit
language:
- en
---
## References
For more information, please refer to our paper and GitHub repository.
Paper: [BioT5: Enriching Cross-modal Integration in Biology with Chemical Knowledge and Natural Language Associations](https://arxiv.org/abs/2310.07276)
GitHub: [BioT5](https://github.com/QizhiPei/BioT5)
Authors: *Qizhi Pei, Wei Zhang, Jinhua Zhu, Kehan Wu, Kaiyuan Gao, Lijun Wu, Yingce Xia, and Rui Yan* |
davanstrien/testpapercomments-ds | ---
dataset_info:
features:
- name: paper_url
dtype: string
- name: comment
dtype: string
splits:
- name: train
num_bytes: 502672
num_examples: 456
download_size: 0
dataset_size: 502672
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "testpapercomments-ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
orgcatorg/stripes | ---
configs:
- config_name: Africa
data_files:
- split: train
path: Africa/train-*
- config_name: Asia-Pacific
data_files:
- split: train
path: Asia-Pacific/train-*
- config_name: Europe
data_files:
- split: train
path: Europe/train-*
- config_name: Middle East
data_files:
- split: train
path: Middle East/train-*
- config_name: US
data_files:
- split: train
path: US/train-*
dataset_info:
- config_name: Africa
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: string
- name: image
dtype: string
- name: image_caption
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 873549
num_examples: 175
download_size: 530795
dataset_size: 873549
- config_name: Asia-Pacific
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: string
- name: image
dtype: string
- name: image_caption
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 2597100
num_examples: 596
download_size: 1526683
dataset_size: 2597100
- config_name: Europe
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: string
- name: image
dtype: string
- name: image_caption
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 6333893
num_examples: 1241
download_size: 3748163
dataset_size: 6333893
- config_name: Middle East
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: string
- name: image
dtype: string
- name: image_caption
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 6203258
num_examples: 958
download_size: 3539626
dataset_size: 6203258
- config_name: US
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: string
- name: image
dtype: string
- name: image_caption
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 6964806
num_examples: 1220
download_size: 4135894
dataset_size: 6964806
---
# Dataset Card for "stripes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_10000_eye_movements_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 236440000
num_examples: 10000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 155715478
dataset_size: 472880000
---
# Dataset Card for "autotree_automl_10000_eye_movements_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713038771 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11220
num_examples: 25
download_size: 8392
dataset_size: 11220
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713038771"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
communityai/Open-Orca___1million-gpt-4-200k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 371133122.45702064
num_examples: 200000
download_size: 196792432
dataset_size: 371133122.45702064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tacoz/audCatImages | ---
license: openrail
---
|
vekkt/french_CEFR | ---
license: mit
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_198 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1071458640.0
num_examples: 210420
download_size: 1093799592
dataset_size: 1071458640.0
---
# Dataset Card for "chunk_198"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
5mN/Sum-assistant-v1 | ---
task_categories:
- question-answering
language:
- en
- hr
pretty_name: Univeristy of Mostar Assistant
--- |
one-sec-cv12/chunk_160 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 22251248064.5
num_examples: 231668
download_size: 20138689166
dataset_size: 22251248064.5
---
# Dataset Card for "chunk_160"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Luizagrod23/RyotaSakuraba | ---
license: openrail
---
|
ddonuts/recurrent-events | ---
license: other
---
|
SauravMaheshkar/NDC-substances-25 | ---
license: unknown
task_categories:
- graph-ml
tags:
- chemistry
configs:
- config_name: transductive
data_files:
- split: train
path: "processed/transductive/train_df.csv"
- split: valid
path: "processed/transductive/val_df.csv"
- split: test
path: "processed/transductive/test_df.csv"
- config_name: inductive
data_files:
- split: train
path: "processed/inductive/train_df.csv"
- split: valid
path: "processed/inductive/val_df.csv"
- split: test
path: "processed/inductive/test_df.csv"
- config_name: raw
data_files: "raw/*.txt"
---
Source Paper: https://arxiv.org/abs/1802.06916
### Usage
```python
from torch_geometric.datasets.cornell import CornellTemporalHyperGraphDataset
dataset = CornellTemporalHyperGraphDataset(root = "./", name="NDC-substances-25", split="train")
```
### Citation
```bibtex
@article{Benson-2018-simplicial,
author = {Benson, Austin R. and Abebe, Rediet and Schaub, Michael T. and Jadbabaie, Ali and Kleinberg, Jon},
title = {Simplicial closure and higher-order link prediction},
year = {2018},
doi = {10.1073/pnas.1800683115},
publisher = {National Academy of Sciences},
issn = {0027-8424},
journal = {Proceedings of the National Academy of Sciences}
}
``` |
Ve11ichor/Song_SA_np_input | ---
license: apache-2.0
task_categories:
- text-classification
language:
- zh
size_categories:
- 1K<n<10K
--- |
aisc-team-d1/PMC_Data | ---
configs:
- config_name: default
data_files:
- split: train
path: train-*
- split: test
path: test-*
dataset_info:
features:
- name: PMC_id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: inline
dtype: string
- name: img_ref
dtype: string
splits:
- name: train
num_bytes: 634222272
num_examples: 316838
- name: test
num_bytes: 253538916
num_examples: 120836
download_size: 139781550
dataset_size: 887761188
---
# PMC-CaseReport Dataset
- [PMC-CaseReport_original Dataset](#pmc-casereport-dataset)
- [Dataset Structure](#dataset-structure)
- [Sample](#sample)
This repository contains the text parts; the figure parts can be downloaded from https://pan.baidu.com/s/1Src_rhXsaOFp8zJ_3zMFsQ?pwd=p3ne.
## Dataset Structure
**PMC-CaseReport** (filtered version: 317K VQA pairs for training and 121K for testing).
The dataset can be loaded following the standard Hugging Face `datasets` usage:
```python
from datasets import load_dataset
dataset = load_dataset("chaoyi-wu/PMC-CaseReport_original")
```
## Sample
A case in the dataset is shown below:
| PMC_id | PMC9052276 |
| -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| context | We report the case of a 73-year-old female who presented to the ER with left-sided body weakness of unclear duration.She had an ischemic stroke four years prior with no residual neurologic deficits, a myocardial infarction requiring coronary artery bypass grafting (CABG) two years prior, hypertension, and dementia. Her vital signs were blood pressure (BP) 117/78 mmHg, pulse 121 beats per minute, temperature 98.9 F, respiratory rate (RR) 18 cycles/minute, and oxygen saturation (SpO2) of 97% on ambient air.She was disoriented to place and time with a Glasgow Coma Score (GCS) of 14 (E4V4M6).Her speech was slurred, cranial nerves (CN) 2-12 were grossly intact, motor strength on the left upper and lower extremities was 0/5 and on the right upper and lower extremities was 4/5, and the sensation was preserved in all extremities.The patient had a National Institutes of Health Stroke Scale (NIHSS) score of 16 and a Modified Rankin Score (mRS) of 5 points.A non-contrast head CT scan revealed evidence of old lacuna infarcts in the basal ganglia and thalamus.No intracranial hemorrhage or acute infarct was found.CT perfusion was not done as our center lacks the resources needed to perform that. |
| inline | A brain MRI scan showed an acute pontine stroke (Figures and old infarcts |
| question | What did the brain MRI scan reveal? |
| answer | The brain MRI scan showed an acute pontine stroke and old infarcts. |
| img_ref | "['FIG1', 'FIG3', 'FIG4']" |
Explanation of each key:
- PMC_id: the corresponding PMC paper id.
- context: the context in the case report preceding the discussion of the image.
- inline: the inline sentence in the original paper used for reference; it should not be input to the network.
- question: the generated question.
- answer: the correct answer.
- img_ref: the list of related image ids.
You can get the images from our PMC figure parts; figures are named uniformly as ```PMCxxxxxxx_figid.jpg```, e.g. ```PMC9052276_FIG1.jpg```.
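As a small sketch of that naming convention (the `figure_filenames` helper and the example row below are illustrative, not part of the released tooling), the figure file names for a QA example can be reconstructed from its `PMC_id` and `img_ref` fields:

```python
import ast

def figure_filenames(example):
    # img_ref is stored as the string form of a Python list, e.g. "['FIG1', 'FIG3']"
    fig_ids = ast.literal_eval(example["img_ref"])
    # Figures follow the PMCxxxxxxx_figid.jpg convention described above.
    return [f"{example['PMC_id']}_{fig_id}.jpg" for fig_id in fig_ids]

example = {"PMC_id": "PMC9052276", "img_ref": "['FIG1', 'FIG3', 'FIG4']"}
print(figure_filenames(example))
# → ['PMC9052276_FIG1.jpg', 'PMC9052276_FIG3.jpg', 'PMC9052276_FIG4.jpg']
```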
Note that we have not filtered the context strictly, so in a few cases the answer may be leaked in the context.
Besides, our PMC figures were collected before this dataset was built, and during that time window some papers were updated. Thus, some figures may be missing from our figure base. |
AayushShah/SQL_Merged_IDs_and_Text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: NATURAL_LANG
dtype: string
- name: SCHEMA
dtype: string
- name: SQL
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1089459820.9581463
num_examples: 270986
- name: test
num_bytes: 121052878.04185376
num_examples: 30110
download_size: 101851785
dataset_size: 1210512699.0
---
# Dataset Card for "SQL_Merged_IDs_and_Text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
voidful/wikihow_chat | ---
dataset_info:
features:
- name: article_id
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: related_document_urls_wayback_snapshots
sequence: string
- name: split
dtype: int64
- name: cluster
dtype: int64
- name: dialog
dtype: string
splits:
- name: train
num_bytes: 19620137
num_examples: 8235
- name: test
num_bytes: 5507274
num_examples: 2333
- name: validation
num_bytes: 2810866
num_examples: 1178
download_size: 14161836
dataset_size: 27938277
---
# Dataset Card for "wikihow_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Azie88/COVID_Vaccine_Tweet_sentiment_analysis_roberta | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 1827789
num_examples: 7999
- name: eval
num_bytes: 527000
num_examples: 2000
download_size: 569069
dataset_size: 2354789
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---
|
nanyy1025/pubmed_rct_20k | ---
license: openrail
---
|
multimodalart/latent-majesty-diffusion-settings | ---
license: mit
---
A collection of default settings for the text-to-image model [Latent Majesty Diffusion](https://colab.research.google.com/github/multimodalart/majesty-diffusion/blob/main/latent.ipynb). If you love your settings, please add yours by going to the `Files and versions` tab and hitting upload.

Also, please add a description of what your settings excel at (it's okay if they are general-purpose too)
 |
GBaker/MedQA-USMLE-4-options-hf-cosine-similarity | ---
license: cc-by-sa-4.0
---
Original dataset introduced by Jin et al. in [What Disease does this Patient Have? A Large-scale Open Domain Question Answering Dataset from Medical Exams](https://paperswithcode.com/paper/what-disease-does-this-patient-have-a-large)
This version is augmented with context retrieved from the textbooks provided with the original dataset using cosine similarity.
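A minimal sketch of this kind of cosine-similarity retrieval (the embedding setup and the `top_k_passages` helper are assumptions for illustration, not the augmentation script actually used):

```python
import numpy as np

# Rank candidate textbook passages by cosine similarity to a question embedding.
def top_k_passages(query_vec, passage_vecs, k=3):
    q = query_vec / np.linalg.norm(query_vec)                      # unit-normalize query
    P = passage_vecs / np.linalg.norm(passage_vecs, axis=1, keepdims=True)  # unit-normalize passages
    scores = P @ q                                                 # cosine similarities
    return np.argsort(-scores)[:k]                                 # indices of the top-k passages
```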
<h4>Citation information:</h4>
@article{jin2020disease,
title={What Disease does this Patient Have? A Large-scale Open Domain Question Answering Dataset from Medical Exams},
author={Jin, Di and Pan, Eileen and Oufattole, Nassim and Weng, Wei-Hung and Fang, Hanyi and Szolovits, Peter},
journal={arXiv preprint arXiv:2009.13081},
year={2020}
} |
Prajapat/banking_conversation_falcon | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 221593
num_examples: 800
download_size: 106357
dataset_size: 221593
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jlmarrugom/gin-img-datasets | ---
license: apache-2.0
---
|
RyokoAI/Sensei | ---
license: cc0-1.0
---
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_4_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 1792
num_examples: 63
download_size: 0
dataset_size: 1792
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_4_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Svenni551/toxic-full-uncensored-v3.1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 42273
num_examples: 43
download_size: 28694
dataset_size: 42273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mHossain/final_train_v4_test_1040000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 7345201.5
num_examples: 18000
- name: test
num_bytes: 816133.5
num_examples: 2000
download_size: 3516028
dataset_size: 8161335.0
---
# Dataset Card for "final_train_v4_test_1040000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wardenga/lsoie | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- extended|qa_srl
task_categories:
- text-retrieval
task_ids: []
pretty_name: LSOIE
tags:
- Open Information Extraction
---
# Dataset Card for LSOIE
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://github.com/Jacobsolawetz/large-scale-oie
- **Repository:** https://github.com/Jacobsolawetz/large-scale-oie
- **Paper:** https://arxiv.org/abs/2101.11177
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
The Large Scale Open Information Extraction Dataset (LSOIE) is a dataset 20 times larger than the next-largest human-annotated Open Information Extraction (OIE) dataset. LSOIE is built upon the QA-SRL 2.0 dataset by transforming the list of questions and answers for each predicate into a tuple representing a fact.
### Supported Tasks and Leaderboards
Open Information Extraction
### Languages
The text in this dataset is English.
## Dataset Structure
### Data Instances
A datapoint comprises one fact together with the sentence it was extracted from. There can be multiple facts for each sentence. Each fact is represented by a tuple $(a_0, p, a_1,\dots, a_n)$ where $a_0$ is the head entity, $p$ is the predicate, and $a_1, \dots, a_n$ represent the tail.
### Data Fields
- word_ids : sequence of indices (int) representing tokens in a sentence,
- words : a sequence of strings, the tokens in the sentence,
- pred : the predicate of the fact,
- pred_ids : ids of the tokens in the predicate,
- head_pred_id : id of the head token in the predicate,
- sent_id : sentence id,
- run_id : ,
- label : sequence of tags (BIO) representing the fact; e.g., if the fact is given by $(a_0, p, a_1, \dots, a_n)$, each token is tagged with the fact element it belongs to
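As an illustration of the BIO tagging (the tag names and sentence here are assumptions for exposition, not necessarily the dataset's exact scheme):

```python
# One token per label; B-/I- mark the beginning/inside of a fact element,
# and O marks tokens outside the fact (illustrative tag set).
words  = ["Obama", "was", "born", "in", "Hawaii"]
labels = ["B-A0", "O", "B-P", "I-P", "B-A1"]  # fact: (Obama, born in, Hawaii)

for w, l in zip(words, labels):
    print(f"{w}\t{l}")
```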
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
[Needs More Information] |
hamedhf/nlp_twitter_analysis | ---
license: mit
task_categories:
- text-classification
language:
- fa
- en
--- |
zyy111/aihuihua | ---
license: openrail
---
|
vishal-burman/moe_misspellings | ---
dataset_info:
features:
- name: correct_word
dtype: string
- name: incorrect_words
sequence: string
splits:
- name: train
num_bytes: 278157870
num_examples: 2720843
download_size: 163693499
dataset_size: 278157870
---
# Dataset Card for "moe_misspellings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harouzie/vi_question_generation | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 211814961.2307449
num_examples: 174499
- name: test
num_bytes: 26477628.80776531
num_examples: 21813
- name: valid
num_bytes: 26476414.961489797
num_examples: 21812
download_size: 142790671
dataset_size: 264769005
task_categories:
- question-answering
- text2text-generation
language:
- vi
pretty_name: Vietnamese Dataset for Extractive Question Answering and Question Generation
size_categories:
- 100K<n<1M
--- |
Denviny/LORA | ---
license: other
---
|
recastai/databricks-dolly-15k-chatml | ---
language:
- en
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 34692013
num_examples: 15011
download_size: 15166632
dataset_size: 34692013
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
- text2text-generation
---
# Dataset Card for "databricks-dolly-15k-chatml"
## Dataset Summary
This dataset has been created by **Re:cast AI** to transform the existing dataset [databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) into a [chatml](https://huggingface.co/docs/transformers/main/en/chat_templating) friendly format for use in SFT tasks with pretrained models.
## Dataset Structure
```python
messages = [
{ "content": "You are an expert Q&A system that is trusted around the world. You always... etc.", "role": "system" },
{ "content": "(Optional) Context information is below.\n----------------\nVirgin Australia, the... etc.", "role": "user" },
{ "content": "Virgin Australia commenced services on 31 August 2000... etc.", "role": "assistant" } ]
]
```
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("recastai/databricks-dolly-15k-chatml", split="train")
```
## Processing applied to original dataset
```python
INSTRUCTIONS = """You are an expert Q&A system that is trusted around the world. You always answer the user's query in a helpful and friendly way.
Some rules you always follow:
1. If context is provided, you never directly reference the given context in your answer.
2. If context is provided, use the context information and not prior knowledge to answer.
3. Avoid statements like 'Based on the context, ...' or 'The context information ...' or 'The answer to the user's query...' or anything along those lines.
4. If no context is provided use your internal knowledge to answer."""
# databricks-dolly-15k features:
# - instruction: The user query/question
# - context: (optional) context to use to help the assistant
# - response: The assistant's response to the query/question
#
key_mapping = dict(
query = "instruction",
context = "context",
response = "response"
)
def process_chatml_fn(example, validation=False):
    """
    Processing specific to databricks-dolly-15k into a chat format.
    """
    user_content = (
        "(Optional) Context information is below.\n"
        "----------------\n"
        "{context}\n"
        "----------------\n"
        "Answer the following query.\n"
        "{query}\n"
    )
    assistant_content = "{response}"
    message = [
        {"role": "system", "content": INSTRUCTIONS},
        {"role": "user", "content": user_content.format(context=example[key_mapping['context']], query=example[key_mapping['query']])},
        {"role": "assistant", "content": assistant_content.format(response=example[key_mapping['response']])}
    ]
    return message
``` |
MLNTeam-Unical/NFT-70M_transactions | ---
dataset_info:
features:
- name: num_sales
dtype: int64
- name: fees_seller
dtype: float64
- name: fees_opensea
dtype: float64
- name: fees_seller_usd
dtype: float64
- name: fees_opensea_usd
dtype: float64
- name: tx_timestamp
dtype: string
- name: price
dtype: float64
- name: gain
dtype: float64
- name: usd_price
dtype: float64
- name: usd_gain
dtype: float64
- name: token
dtype: string
- name: to_eth
dtype: float64
- name: to_usd
dtype: float64
- name: created_date
dtype: string
- name: chain
dtype: string
- name: token_type
dtype: string
- name: asset_contract_type
dtype: string
- name: asset_type
dtype: string
- name: payout_collection_address
dtype: int64
- name: from_account
dtype: int64
- name: to_account
dtype: int64
- name: seller_account
dtype: int64
- name: winner_account
dtype: int64
- name: contract_address
dtype: int64
- name: nft_image
dtype: int64
- name: collection_image
dtype: int64
- name: token_id
dtype: int64
- name: nft_name
dtype: int64
- name: nft_description
dtype: int64
- name: collection_name
dtype: int64
- name: collection_description
dtype: int64
splits:
- name: train
num_bytes: 21291348001
num_examples: 70972143
download_size: 6633664673
dataset_size: 21291348001
size_categories:
- 10M<n<100M
license: cc-by-nc-4.0
task_categories:
- time-series-forecasting
- text-classification
- feature-extraction
- text-generation
- zero-shot-classification
- text2text-generation
- sentence-similarity
- image-classification
- image-to-text
- text-to-image
- text-retrieval
language:
- en
tags:
- Non-fungible Tokens
- Crypto
- Web3
- Art
- Multimodal Learning
pretty_name: NFT-70M_transactions
---
# Dataset Card for "NFT-70M_transactions"
## Dataset summary
The *NFT-70M_transactions* dataset is the largest and most up-to-date collection of Non-Fungible Tokens (NFT) transactions between 2021 and 2023 sourced from [OpenSea](https://opensea.io), the leading trading platform in the Web3 ecosystem.
With more than 70M transactions enriched with metadata, this dataset is conceived to support a wide range of tasks, ranging from sequential and transactional data processing/analysis to graph-based modeling of the complex relationships between traders.
Besides, the availability of textual and image contents further amplifies the modeling capabilities and usage opportunities of this dataset, making it a unique and comprehensive multimodal source of information for delving into the NFT landscape.
This dataset can serve as a benchmark for various innovative and impactful tasks within the crypto landscape, such as projecting NFT prices or detecting fraudulent and wash trading activities.
Furthermore, the multimodal nature of the dataset fosters the development of classification models, as well as textual and visual generative models.
## Data anonymization
We point out that the collected NFT transactions and metadata from OpenSea are publicly distributed on blockchain.
For our purposes of re-distribution, we are also committed to ensuring non-disclosure of information that might lead to identifying the NFT creators, in order to comply with privacy-preserving requirements and to avoid violating data protection regulations and property rights.
In this respect, we carried out three actions:
- Values of all variables describing non-sensitive information were kept in their original form;
- Values of all variables describing sensitive information were anonymized, in a one-way, non-revertible mode;
- URLs of image data and textual contents (i.e., NFT images and their descriptions) were replaced by identifiers to numerical vectors that represent an encrypted representation (i.e., embeddings) of the image/text contents obtained via neural network models. Such embeddings are eventually provided in place of their original image and text data,
and can be found in the [**NFT-70M_image**](https://huggingface.co/datasets/MLNTeam-Unical/NFT-70M_image) and [**NFT-70M_text**](https://huggingface.co/datasets/MLNTeam-Unical/NFT-70M_text) supplementary datasets, respectively.
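A minimal sketch of the kind of one-way, non-revertible anonymization described above (salted SHA-256 is an assumption for illustration; the card does not specify the actual hashing scheme or salt):

```python
import hashlib

# Map a sensitive value (e.g. a wallet address) to a stable, non-revertible
# hash-code; the salt "example-salt" is a hypothetical placeholder.
def anonymize(value, salt="example-salt"):
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

print(anonymize("0xabc123"))  # the same input always maps to the same hash-code
```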
## Data Fields
| Variable | Type | Description | Processing | Notes |
|--------------------------|-------------|-----------------------------------------------------------------------------------------------------------|------------------|-----------------------------------|
| token_id | String | The id of the NFT — this value is unique within the same collection | Anonymized | Original values were replaced by hash-codes |
| num_sales | Integer | A progressive integer indicating the number of successful transactions involving the NFT up to the current timestamp (cf. *tx_timestamp*) | Original | Not sensitive variable |
| nft_name | Vector ID | The name of the NFT | Anonymized | Original values were encrypted via neural textual embedding |
| nft_description | Vector ID | The description of the NFT as provided by the creator | Anonymized | Original values were encrypted via neural textual embedding |
| nft_image | Vector ID | The ID for accessing the NFT image vector | Anonymized | Original values were encrypted via neural visual embedding |
| collection_name | Vector ID | The ID for accessing the Collection name vector | Anonymized | Original values were encrypted via neural textual embedding |
| collection_description | Vector ID | The ID for accessing the Collection description vector | Anonymized | Original values were encrypted via neural textual embedding |
| collection_image | Vector ID | The ID for accessing the Collection image vector | Anonymized | Original values were encrypted via neural visual embedding |
| fees_seller | Float | The absolute amount of fees the seller has gained from this transaction expressed in *token* | Original | Not sensitive variable |
| fees_opensea | Float | The absolute amount of fees OpenSea has gained from this transaction expressed in *token* | Original | Not sensitive variable |
| fees_seller_usd | Float | The absolute amount of fees the seller has gained from this transaction expressed in US dollars (USD) | Original | Not sensitive variable |
| fees_opensea_usd | Float | The absolute amount of fees OpenSea has gained from this transaction expressed in US dollars (USD) | Original | Not sensitive variable |
| payout_collection_address| String | The wallet address where seller fees are deposited | Anonymized | Original values were replaced by hash-codes |
| tx_timestamp | String | Timestamp of the transaction expressed in yyyy-mm-ddTHH:MM:SS | Original | Not sensitive variable |
| price | Float | The price of the transaction expressed in token | Original | Not sensitive variable |
| gain | Float | The gain after fees (i.e., gain = price - fees_opensea * price - fees_seller * price) | Original | Not sensitive variable |
| usd_price | Float | The price of the transaction expressed in US dollars (USD) | Original | Not sensitive variable |
| usd_gain | Float | The difference between the price and the fees expressed in US dollars (USD) | Original | Not sensitive variable |
| token | Categorical | The token type used to pay the transaction | Original | Not sensitive variable |
| to_eth | Float | The conversion rate to convert tokens into Ethereum at the current timestamp, such that eth = price * to_eth | Original | Not sensitive variable |
| to_usd | Float | The conversion rate to convert tokens into US dollars (USD) at the current timestamp, such that usd = price * to_usd | Original | Not sensitive variable |
| from_account | String | The address that sends the payment (i.e., winner/buyer) | Anonymized | Original values were replaced by hash-codes |
| to_account | String | The address that receives the payment (it often corresponds to the contract linked to the asset) | Anonymized | Original values were replaced by hash-codes |
| seller_account | String | The address of the NFT seller | Anonymized | Original values were replaced by hash-codes |
| winner_account | String | The address of the NFT buyer | Anonymized | Original values were replaced by hash-codes |
| contract_address | String | The contract address on the blockchain | Anonymized | Original values were replaced by hash-codes |
| created_date | Timestamp | The date of creation of the contract | Original | Not sensitive variable |
| chain | Categorical | The blockchain where the transaction occurs | Original | Not sensitive variable |
| token_type | Categorical | The schema of the token, i.e., ERC721 or ERC1155 | Original | Not sensitive variable |
| asset_contract_type | Categorical | The asset typology, i.e., non-fungible or semi-fungible | Original | Not sensitive variable |
| asset_type | Categorical | Whether the asset was involved in a simple or bundle transaction | Original | Not sensitive variable |
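As a sketch of how the monetary fields relate, following the `gain` and conversion formulas in the table above (treating the fee fields as rates here, as the `gain` formula implies; this is an illustrative assumption):

```python
# Illustrative only; derives gain and currency conversions for one transaction row.
def derived_fields(row):
    gain = row["price"] - row["fees_opensea"] * row["price"] - row["fees_seller"] * row["price"]
    return {
        "gain": gain,                          # price minus both fee components
        "eth":  row["price"] * row["to_eth"],  # price converted to Ethereum
        "usd":  row["price"] * row["to_usd"],  # price converted to US dollars
    }

example = {"price": 100.0, "fees_opensea": 0.025, "fees_seller": 0.05,
           "to_eth": 0.01, "to_usd": 1.0}
print(derived_fields(example))  # gain ≈ 92.5, eth ≈ 1.0, usd ≈ 100.0
```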
## How to use
Data provided within this repository can be straightforwardly loaded via the *datasets* library as follows:
```python
from datasets import load_dataset
dataset = load_dataset("MLNTeam-Unical/NFT-70M_transactions")
```
Complementary data involving textual and visual embeddings can be integrated as follows:
```python
from datasets import load_dataset
import numpy as np
transactions_dataset=load_dataset("MLNTeam-Unical/NFT-70M_transactions")
image_dataset=load_dataset("MLNTeam-Unical/NFT-70M_image")
text_dataset=load_dataset("MLNTeam-Unical/NFT-70M_text")
# Mapping from image_id to the row_index within the image dataset
image_id2row_index={int(id):k for k,id in enumerate(image_dataset["train"]["id"])}
# Mapping from text_id to row_index within the text dataset
text_id2row_index={int(id):k for k,id in enumerate(text_dataset["train"]["id"])}
def get_image_embedding(image_id, image_id2row_index, image_dataset):
    # If the mapping contains the image, the embedding exists
    idx_emb = image_id2row_index.get(int(image_id), None)
    if idx_emb is not None:
        # If the embedding exists, return it (note: index 0 is a valid row)
        return np.array(image_dataset["train"].select([idx_emb])["emb"][0])
    else:
        return None

def get_text_embedding(text_id, text_id2row_index, text_dataset):
    # If the mapping contains the text, the embedding exists
    idx_emb = text_id2row_index.get(int(text_id), None)
    if idx_emb is not None:
        # If the embedding exists, return it (note: index 0 is a valid row)
        return np.array(text_dataset["train"].select([idx_emb])["emb"][0])
    else:
        return None
### USAGE EXAMPLE ###
# Select transaction_id
transaction_id=120
# Get the image_id (e.g., collection_image or nft_image)
id_image=transactions_dataset["train"].select([transaction_id])["collection_image"][0]
# Get the image
image_embedding=get_image_embedding(id_image,image_id2row_index,image_dataset)
# Get the text_id
id_text=transactions_dataset["train"].select([transaction_id])["collection_description"][0]
# Get the text
text_embedding=get_text_embedding(id_text,text_id2row_index,text_dataset)
```
## Ethical use of data and informed consent
This data repository is made available for research and informational purposes only.
Any finding that might be drawn from the data provided within this repository should be intended to support decision-making regarding actions made on NFTs, and not to replace the human specialists.
*The authors are not responsible for any issues related to trading failures based on the data provided within this repository.*
## Terms of Usage
Please cite the following papers in any research product whose findings are based on the data provided within this repository:
- L. La Cava, D. Costa, A. Tagarelli: SONAR: Web-based Tool for Multimodal Exploration of Non-Fungible Token Inspiration Networks. In: Proc. ACM SIGIR 2023. Taipei, Taiwan, July 23-27 2023. DOI: https://doi.org/10.1145/3539618.3591821
- L. La Cava, D. Costa, A. Tagarelli: Visually Wired NFTs: Exploring the Role of Inspiration in Non-Fungible Tokens. CoRR abs/2303.17031 (2023). DOI: https://doi.org/10.48550/arXiv.2303.17031
- D. Costa, L. La Cava, A. Tagarelli: Show me your NFT and I tell you how it will perform: Multimodal representation learning for NFT selling price prediction. In: Proc. ACM WebConf 2023, pp. 1875-1885. Austin, TX, USA, 30 April 2023 – 4 May 2023. DOI: https://doi.org/10.1145/3543507.3583520
Data within this repository were fetched using the REST APIs provided by OpenSea. You should also acknowledge the [OpenSea API](https://docs.opensea.io/reference/api-overview).
## Liability statement
The authors hereby declare that they are not responsible for any harmful or objectionable content that may be contained within the data provided within this repository.
Users of the dataset are expected to exercise due diligence and responsibility when using the data, including but not limited to:
(i) Content Review: Users should review the dataset's contents carefully and assess its suitability for their intended purposes; (ii) Compliance: Users are responsible for ensuring that their use of the dataset complies with all applicable laws, regulations, and ethical standards;
(iii) Data Processing: Users may need to apply data preprocessing, filtering, or other techniques to remove or address any objectionable or harmful content as needed.
The authors of this dataset disclaim any liability for the accuracy, completeness, or suitability of the data and shall not be held responsible for any consequences resulting from the use or misuse of the dataset.
*By accessing and using this dataset, users acknowledge and accept this disclaimer.* |
dbaezaj/lince_ner_dataset | ---
dataset_info:
features:
- name: id
dtype: int64
- name: words
sequence: string
- name: lid
sequence: string
- name: labels
sequence: string
--- |
open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge | ---
pretty_name: Evaluation run of fionazhang/fine-tune-mistral-environment-merge
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fionazhang/fine-tune-mistral-environment-merge](https://huggingface.co/fionazhang/fine-tune-mistral-environment-merge)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-29T01:47:21.122290](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge/blob/main/results_2024-01-29T01-47-21.122290.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6354317163514616,\n\
\ \"acc_stderr\": 0.032307072675482454,\n \"acc_norm\": 0.6419311935208026,\n\
\ \"acc_norm_stderr\": 0.03296064982960984,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4397408572877062,\n\
\ \"mc2_stderr\": 0.014194431681893268\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759086\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.635929097789285,\n\
\ \"acc_stderr\": 0.00480185288132974,\n \"acc_norm\": 0.8365863373829915,\n\
\ \"acc_norm_stderr\": 0.0036898701424130753\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.043902592653775614,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.043902592653775614\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739154,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n\
\ \"acc_stderr\": 0.016062290671110462,\n \"acc_norm\": 0.36089385474860336,\n\
\ \"acc_norm_stderr\": 0.016062290671110462\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632938,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632938\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n\
\ \"acc_stderr\": 0.012698825252435108,\n \"acc_norm\": 0.4471968709256845,\n\
\ \"acc_norm_stderr\": 0.012698825252435108\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000314,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000314\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4397408572877062,\n\
\ \"mc2_stderr\": 0.014194431681893268\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710681\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3525398028809704,\n \
\ \"acc_stderr\": 0.013159909755930323\n }\n}\n```"
repo_url: https://huggingface.co/vicgalle/SystemHermes-2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|arc:challenge|25_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|gsm8k|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hellaswag|10_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T01-47-21.122290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T01-47-21.122290.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- '**/details_harness|winogrande|5_2024-01-29T01-47-21.122290.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T01-47-21.122290.parquet'
- config_name: results
data_files:
- split: 2024_01_29T01_47_21.122290
path:
- results_2024-01-29T01-47-21.122290.parquet
- split: latest
path:
- results_2024-01-29T01-47-21.122290.parquet
---
# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-environment-merge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fionazhang/fine-tune-mistral-environment-merge](https://huggingface.co/fionazhang/fine-tune-mistral-environment-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-29T01:47:21.122290](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge/blob/main/results_2024-01-29T01-47-21.122290.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6354317163514616,
"acc_stderr": 0.032307072675482454,
"acc_norm": 0.6419311935208026,
"acc_norm_stderr": 0.03296064982960984,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4397408572877062,
"mc2_stderr": 0.014194431681893268
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650649,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759086
},
"harness|hellaswag|10": {
"acc": 0.635929097789285,
"acc_stderr": 0.00480185288132974,
"acc_norm": 0.8365863373829915,
"acc_norm_stderr": 0.0036898701424130753
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.043902592653775614,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.043902592653775614
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812142,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739154,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36089385474860336,
"acc_stderr": 0.016062290671110462,
"acc_norm": 0.36089385474860336,
"acc_norm_stderr": 0.016062290671110462
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632938,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632938
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4471968709256845,
"acc_stderr": 0.012698825252435108,
"acc_norm": 0.4471968709256845,
"acc_norm_stderr": 0.012698825252435108
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000314,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000314
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4397408572877062,
"mc2_stderr": 0.014194431681893268
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710681
},
"harness|gsm8k|5": {
"acc": 0.3525398028809704,
"acc_stderr": 0.013159909755930323
}
}
```
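To work with these numbers programmatically, the per-task entries can be aggregated back into an average. This is a minimal illustrative sketch, not part of the evaluation harness; the dict below is truncated to two tasks for brevity:

```python
# Recompute an average accuracy over MMLU (hendrycksTest) tasks
# from a results dict like the one above (truncated to two tasks).
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8304093567251462},
}

mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} tasks: {mmlu_avg:.4f}")
```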
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Megnis/python_code_instructions_27k_Saiga | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 83038004
num_examples: 27224
download_size: 30871544
dataset_size: 83038004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EmmaGthn/Moji_balanced | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: gender
dtype: int64
splits:
- name: train
num_bytes: 6618798
num_examples: 53016
- name: test
num_bytes: 256593
num_examples: 2000
download_size: 2582045
dataset_size: 6875391
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/galleon_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of galleon/ガレヲン (Granblue Fantasy)
This is the dataset of galleon/ガレヲン (Granblue Fantasy), containing 325 images and their tags.
The core tags of this character are `brown_hair, long_hair, animal_ears, horns, breasts, pointy_ears, extra_ears, bangs, multicolored_hair, large_breasts, streaked_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 325 | 557.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 325 | 293.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 815 | 666.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 325 | 483.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 815 | 988.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/galleon_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, black_dress, closed_eyes, detached_sleeves, frilled_sleeves, solo, white_gloves, bare_shoulders, blush |
| 1 | 5 |  |  |  |  |  | 1girl, closed_eyes, detached_sleeves, frilled_sleeves, solo, white_gloves, asymmetrical_hair, upper_body |
| 2 | 5 |  |  |  |  |  | 1girl, asymmetrical_hair, asymmetrical_legwear, closed_eyes, detached_sleeves, frilled_sleeves, solo, thigh_strap, white_gloves, pelvic_curtain, black_dress, full_body, hair_between_eyes, single_thighhigh |
| 3 | 16 |  |  |  |  |  | 1boy, 1girl, closed_eyes, hetero, solo_focus, blush, detached_sleeves, white_gloves, nipples, paizuri, huge_breasts, mosaic_censoring, kissing_penis, nude |
| 4 | 17 |  |  |  |  |  | 1girl, black_dress, blindfold, cleavage, solo, blue_hair, smile, thigh_strap, long_sleeves, mask, nail_polish, closed_mouth, facing_viewer, parted_lips |
| 5 | 9 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, closed_eyes, navel, solo, bikini, hair_between_eyes, thighs, blush, blue_hair, collarbone, thigh_strap, wet |
| 6 | 11 |  |  |  |  |  | 1girl, closed_eyes, solo, cleavage, collared_shirt, long_sleeves, white_shirt, blue_hair, blush, smile, collarbone, hair_between_eyes, naked_shirt, navel, sitting |
| 7 | 5 |  |  |  |  |  | 1girl, closed_eyes, completely_nude, hair_between_eyes, solo, smile, collarbone, nipples, artist_name, barefoot, blue_hair, blush, closed_mouth, lips, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | closed_eyes | detached_sleeves | frilled_sleeves | solo | white_gloves | bare_shoulders | blush | asymmetrical_hair | upper_body | asymmetrical_legwear | thigh_strap | pelvic_curtain | full_body | hair_between_eyes | single_thighhigh | 1boy | hetero | solo_focus | nipples | paizuri | huge_breasts | mosaic_censoring | kissing_penis | nude | blindfold | cleavage | blue_hair | smile | long_sleeves | mask | nail_polish | closed_mouth | facing_viewer | parted_lips | navel | bikini | thighs | collarbone | wet | collared_shirt | white_shirt | naked_shirt | sitting | completely_nude | artist_name | barefoot | lips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------|:-------------------|:------------------|:-------|:---------------|:-----------------|:--------|:--------------------|:-------------|:-----------------------|:--------------|:-----------------|:------------|:--------------------|:-------------------|:-------|:---------|:-------------|:----------|:----------|:---------------|:-------------------|:----------------|:-------|:------------|:-----------|:------------|:--------|:---------------|:-------|:--------------|:---------------|:----------------|:--------------|:--------|:---------|:---------|:-------------|:------|:-----------------|:--------------|:--------------|:----------|:------------------|:--------------|:-----------|:-------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | | X | X | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 17 |  |  |  |  |  | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | | | X | | X | X | | | | X | | | X | | | | | | | | | | | | X | X | | | | | | | | X | X | X | X | X | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | | X | | | X | | | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | | | | | | X | | | X | | X | X | X | X | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | | | X | | | X | | | | | | | X | | | | | X | | | | | | | | X | X | | | | X | | | X | | | X | | | | | | X | X | X | X |
|
joseluhf11/oct-object-detection-v2-merge | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: int64
- name: categories
sequence: string
splits:
- name: train
num_bytes: 153967507.25
num_examples: 1246
download_size: 71637288
dataset_size: 153967507.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oct-object-detection-v2-merge"
The dataset is composed of images with multiple object-detection boxes in COCO format (x, y, w, h). The images are OCT scans (OCT is a type of eye scanner) with boxes indicating features associated with AMD disease.
Changes from v1: boxes for the same detection class are grouped into a single row per image, and overlapping boxes are joined with the merge method, where merge means taking the whole area covered by both boxes.
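A minimal sketch of the merge step described above, assuming axis-aligned COCO `(x, y, w, h)` boxes (the function names are illustrative, not taken from the source dataset's code):

```python
def boxes_overlap(a, b):
    """Check whether two COCO-format (x, y, w, h) boxes intersect."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def merge_boxes(a, b):
    """Merge two boxes into the smallest box covering both areas."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    x1, y1 = min(ax, bx), min(ay, by)
    x2, y2 = max(ax + aw, bx + bw), max(ay + ah, by + bh)
    return (x1, y1, x2 - x1, y2 - y1)

def merge_overlapping(boxes):
    """Repeatedly merge overlapping boxes until no pair overlaps."""
    boxes = list(boxes)
    merged = True
    while merged:
        merged = False
        for i in range(len(boxes)):
            for j in range(i + 1, len(boxes)):
                if boxes_overlap(boxes[i], boxes[j]):
                    boxes[i] = merge_boxes(boxes[i], boxes[j])
                    del boxes[j]
                    merged = True
                    break
            if merged:
                break
    return boxes
```

For example, `merge_overlapping([(0, 0, 10, 10), (5, 5, 10, 10)])` collapses the two overlapping boxes into the single box `(0, 0, 15, 15)` covering both areas.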
[Source dataset](https://doi.org/10.1101/2023.03.29.534704) |
chirunder/MixSnips_for_DecoderOnly_90-10_split-HALF | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 17739996.800127994
num_examples: 22500
- name: test
num_bytes: 1971899.199872005
num_examples: 2501
download_size: 7061034
dataset_size: 19711896.0
---
# Dataset Card for "MixSnips_for_DecoderOnly_90-10_split-HALF"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlpUc3mStudents/mental-risk-d | ---
dataset_info:
features:
- name: subject_id
dtype: string
- name: id_message
dtype: int64
- name: date
dtype: string
- name: message
dtype: string
- name: suffer_in_favour
dtype: float64
- name: suffer_against
dtype: float64
- name: suffer_other
dtype: float64
- name: control
dtype: float64
splits:
- name: train
num_bytes: 949991
num_examples: 6248
- name: test
num_bytes: 91047
num_examples: 624
download_size: 486498
dataset_size: 1041038
---
# Dataset Card for "mental-risk-d"
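The float-valued features above read as soft labels per message; a hypothetical sketch (the column names are taken from the YAML metadata, everything else is an assumption) of collapsing them into a single hard label:

```python
# Label columns as declared in the dataset's feature list above.
LABELS = ["suffer_in_favour", "suffer_against", "suffer_other", "control"]

def hard_label(row):
    """Return the label column with the highest score for one message."""
    return max(LABELS, key=lambda name: row[name])

row = {"suffer_in_favour": 0.1, "suffer_against": 0.6,
       "suffer_other": 0.2, "control": 0.1}
print(hard_label(row))  # suffer_against
```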
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |