| datasetId | card |
|---|---|
AdapterOcean/med_alpaca_standardized_cluster_71_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 17002032
num_examples: 35607
download_size: 9028179
dataset_size: 17002032
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_71_std"
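The metadata above describes a flat message table: each row is one message, keyed by `conversation_id` and ordered within a conversation by `message_id`. As a minimal sketch (the sample rows and the `message_type` values in them are hypothetical, not taken from the dataset), conversations can be reassembled by grouping and sorting:

```python
from collections import defaultdict

def group_conversations(rows):
    """Group flat message rows back into ordered conversations,
    using the conversation_id / message_id fields from the schema above."""
    convs = defaultdict(list)
    for row in rows:
        convs[row["conversation_id"]].append(row)
    for msgs in convs.values():
        # message_id gives the ordering of messages within one conversation
        msgs.sort(key=lambda r: r["message_id"])
    return dict(convs)

# Hypothetical rows shaped like the dataset's features (not real data).
rows = [
    {"message": "Rest and fluids.", "message_type": "output",
     "message_id": 1, "conversation_id": 7, "cluster": 71.0},
    {"message": "What helps a cold?", "message_type": "instruction",
     "message_id": 0, "conversation_id": 7, "cluster": 71.0},
]
convs = group_conversations(rows)
print([m["message_type"] for m in convs[7]])  # ['instruction', 'output']
```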
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thak123/konkani-speech-text-collection | ---
license: mit
---
|
nthngdy/pile_small_miniLM | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: perplexity_score
dtype: float64
- name: pile_set_name
dtype: string
- name: emb
sequence: float32
splits:
- name: train
num_bytes: 760056668
num_examples: 100000
download_size: 545226370
dataset_size: 760056668
---
# Dataset Card for "pile_small_miniLM"
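Judging by the dataset name, the `emb` column likely holds a MiniLM sentence embedding per row, stored as a `float32` sequence (an assumption; the card does not say). A minimal sketch of comparing two such rows by cosine similarity, using stand-in vectors rather than real `ds["emb"]` values:

```python
import math

def cosine_similarity(a, b):
    # emb rows load as plain lists of floats; compare two of them.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Stand-in vectors; real rows would come from ds["emb"] after load_dataset(...).
v1 = [1.0, 0.0, 2.0]
v2 = [2.0, 0.0, 4.0]
print(cosine_similarity(v1, v2))  # parallel vectors, so similarity is ~1.0
```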
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zh-tw-llm-dv-dv/zh-tw-llm-dev-sample-ta8k-d40d11-only_embeddings-tr__alp-cd978d-c2048 | ---
dataset_info:
dataset_size: 1430500.0
download_size: 483276
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- dtype: string
name: preview
splits:
- name: train
num_bytes: 1430500.0
num_examples: 400
---
# zh-tw-llm-dev-sample-ta8k-d40d11-only_embeddings-tr__alp-cd978d-c2048
This dataset is a part of the `zh-tw-llm-dev` project.
* Tokenizer: `zh-tw-llm-dev-tokenizer-a8k-d40d11`
* Built with: `translations`, `wikipedia`, `alpaca`
* Rows: `400`
* Max length: `2048`
* Full config:
```json
{
  "build_with": ["translations", "wikipedia", "alpaca"],
  "preview_length": 256,
  "translations_settings": {
    "source_dataset": "zetavg/coct-en-zh-tw-translations-twp-300k",
    "lang_1_key": "en",
    "lang_2_key": "ch",
    "templates": [
      "English: {lang_1}\nChinese: {lang_2}",
      "Chinese: {lang_2}\nEnglish: {lang_1}"
    ],
    "rows_limit": 100
  },
  "wikipedia_settings": {
    "source_dataset": "zetavg/zh-tw-wikipedia",
    "exclude": [
      {"match": "小行星", "in": "markdown", "in_range": [0, 40]},
      {"match": "是中華人民共和國", "in": "markdown", "in_range": [0, 80]}
    ],
    "rows_limit": 100
  },
  "alpaca_settings": {
    "source_dataset": "zetavg/traditional-chinese-alpaca-en-align",
    "template": "short",
    "train_on_inputs": false,
    "rows_limit": 100
  }
}
``` |
open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01 | ---
pretty_name: Evaluation run of perlthoughts/Chupacabra-16B-v2.01
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Chupacabra-16B-v2.01](https://huggingface.co/perlthoughts/Chupacabra-16B-v2.01)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T17:22:21.064406](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01/blob/main/results_2023-12-09T17-22-21.064406.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6291644824323459,\n\
\ \"acc_stderr\": 0.03274488948991142,\n \"acc_norm\": 0.6364297874777888,\n\
\ \"acc_norm_stderr\": 0.033437033730375994,\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6453446072596959,\n\
\ \"mc2_stderr\": 0.015246087407331466\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6296928327645052,\n \"acc_stderr\": 0.01411129875167495,\n\
\ \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063232\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6387173869747063,\n\
\ \"acc_stderr\": 0.004793904922401889,\n \"acc_norm\": 0.8292172873929496,\n\
\ \"acc_norm_stderr\": 0.003755498941781851\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400352,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400352\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159267,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159267\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n\
\ \"acc_stderr\": 0.014385525076611567,\n \"acc_norm\": 0.7969348659003831,\n\
\ \"acc_norm_stderr\": 0.014385525076611567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677006,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677006\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n\
\ \"acc_stderr\": 0.016421670506339185,\n \"acc_norm\": 0.40558659217877097,\n\
\ \"acc_norm_stderr\": 0.016421670506339185\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\
\ \"acc_stderr\": 0.012689708167787686,\n \"acc_norm\": 0.4439374185136897,\n\
\ \"acc_norm_stderr\": 0.012689708167787686\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02767846864214472,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02767846864214472\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683903,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683903\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6453446072596959,\n\
\ \"mc2_stderr\": 0.015246087407331466\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881578\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2532221379833207,\n \
\ \"acc_stderr\": 0.011978125194299687\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Chupacabra-16B-v2.01
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|arc:challenge|25_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|arc:challenge|25_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|gsm8k|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|gsm8k|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hellaswag|10_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hellaswag|10_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T02-08-47.844785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-22-21.064406.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T17-22-21.064406.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- '**/details_harness|winogrande|5_2023-12-08T02-08-47.844785.parquet'
- split: 2023_12_09T17_22_21.064406
path:
- '**/details_harness|winogrande|5_2023-12-09T17-22-21.064406.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T17-22-21.064406.parquet'
- config_name: results
data_files:
- split: 2023_12_08T02_08_47.844785
path:
- results_2023-12-08T02-08-47.844785.parquet
- split: 2023_12_09T17_22_21.064406
path:
- results_2023-12-09T17-22-21.064406.parquet
- split: latest
path:
- results_2023-12-09T17-22-21.064406.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-16B-v2.01
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Chupacabra-16B-v2.01
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-16B-v2.01](https://huggingface.co/perlthoughts/Chupacabra-16B-v2.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01",
"harness_winogrande_5",
split="train")
```
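The aggregated metrics can be loaded in the same way by pointing at the "results" configuration; this is a sketch using the same `datasets` API (it requires network access to the Hugging Face Hub):

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01",
    "results",
    split="latest",
)
```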
## Latest results
These are the [latest results from run 2023-12-09T17:22:21.064406](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-16B-v2.01/blob/main/results_2023-12-09T17-22-21.064406.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6291644824323459,
"acc_stderr": 0.03274488948991142,
"acc_norm": 0.6364297874777888,
"acc_norm_stderr": 0.033437033730375994,
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6453446072596959,
"mc2_stderr": 0.015246087407331466
},
"harness|arc:challenge|25": {
"acc": 0.6296928327645052,
"acc_stderr": 0.01411129875167495,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063232
},
"harness|hellaswag|10": {
"acc": 0.6387173869747063,
"acc_stderr": 0.004793904922401889,
"acc_norm": 0.8292172873929496,
"acc_norm_stderr": 0.003755498941781851
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342856,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342856
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092437,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159267,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159267
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611567,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677006,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677006
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339185,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339185
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787686,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787686
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02767846864214472,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02767846864214472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683903,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683903
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6453446072596959,
"mc2_stderr": 0.015246087407331466
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881578
},
"harness|gsm8k|5": {
"acc": 0.2532221379833207,
"acc_stderr": 0.011978125194299687
}
}
```
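Every `harness|hendrycksTest-*` entry in this dict shares the same schema, so the aggregate MMLU score can be recomputed directly from a results dict like the one above. A minimal sketch (the helper name is ours, not part of any library):

```python
def mmlu_average(results: dict) -> float:
    """Average 5-shot accuracy over all hendrycksTest (MMLU) subtasks."""
    accs = [entry["acc"] for task, entry in results.items()
            if task.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)
```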
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/yue_lapisrelights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yue (Lapis Re:LiGHTs)
This is the dataset of Yue (Lapis Re:LiGHTs), containing 71 images and their tags.
The core tags of this character are `long_hair, black_hair, bangs, purple_eyes, breasts, hat, hair_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 71 | 40.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yue_lapisrelights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 71 | 33.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yue_lapisrelights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 149 | 64.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yue_lapisrelights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 71 | 40.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yue_lapisrelights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 149 | 73.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yue_lapisrelights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yue_lapisrelights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
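The IMG+TXT packages (e.g. `dataset-800.zip`) need no waifuc at all: each image ships with a same-named `.txt` file holding its comma-separated tags. A minimal sketch for pairing them after extracting one of those archives (the helper below and the assumed image extensions are our own, not part of any library):

```python
import glob
import os

def iter_tag_pairs(dataset_dir):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package."""
    for txt_path in sorted(glob.glob(os.path.join(dataset_dir, "*.txt"))):
        stem, _ = os.path.splitext(txt_path)
        for ext in (".png", ".jpg", ".jpeg", ".webp"):  # assumed image formats
            img_path = stem + ext
            if os.path.exists(img_path):
                with open(txt_path, encoding="utf-8") as f:
                    yield img_path, f.read().strip()
                break
```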
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_shorts, midriff, solo, thigh_strap, black_gloves, black_headwear, navel, short_shorts, belt, looking_at_viewer, shirt, sleeveless, standing, black_footwear, blunt_bangs, boots |
| 1 | 5 |  |  |  |  |  | 1girl, double_bun, looking_at_viewer, solo, closed_mouth, upper_body, indoors, parody, sky |
| 2 | 7 |  |  |  |  |  | 1girl, closed_mouth, double_bun, solo, ascot, closed_eyes, frills, short_sleeves, standing, collarbone, indoors, white_dress, blurry |
| 3 | 10 |  |  |  |  |  | 1girl, solo, anime_coloring, looking_at_viewer, purple_hair, school_uniform, blurry, outdoors, parody, sidelocks, sky, closed_mouth, double_bun |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_shorts | midriff | solo | thigh_strap | black_gloves | black_headwear | navel | short_shorts | belt | looking_at_viewer | shirt | sleeveless | standing | black_footwear | blunt_bangs | boots | double_bun | closed_mouth | upper_body | indoors | parody | sky | ascot | closed_eyes | frills | short_sleeves | collarbone | white_dress | blurry | anime_coloring | purple_hair | school_uniform | outdoors | sidelocks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------|:-------|:--------------|:---------------|:-----------------|:--------|:---------------|:-------|:--------------------|:--------|:-------------|:-----------|:-----------------|:--------------|:--------|:-------------|:---------------|:-------------|:----------|:---------|:------|:--------|:--------------|:---------|:----------------|:-------------|:--------------|:---------|:-----------------|:--------------|:-----------------|:-----------|:------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | X | | | | | | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | | | | | | | | | | X | | | | X | X | | X | | | X | X | X | X | X | X | X | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | X | | | | | | | X | | | | | | | X | X | | | X | X | | | | | | | X | X | X | X | X | X |
|
asgaardlab/GamePhysics | ---
license: creativeml-openrail-m
---
# GamePhysics Dataset
[](https://asgaardlab.github.io/CLIPxGamePhysics/)
[](https://arxiv.org/abs/2203.11096)
[](https://huggingface.co/spaces/taesiri/CLIPxGamePhysics)
The GamePhysics dataset is a collection of gameplay bug videos sourced from the [GamePhysics subreddit](https://www.reddit.com/r/GamePhysics/).
## Sample videos
<video src="https://asgaardlab.github.io/CLIPxGamePhysics/static/videos/9rqabp.mp4" controls="controls" muted="muted" playsinline="playsinline" width=480></video>
<video src="https://asgaardlab.github.io/CLIPxGamePhysics/static/videos/g5pm35.mp4" controls="controls" muted="muted" playsinline="playsinline" width=480></video>
<video src="https://asgaardlab.github.io/CLIPxGamePhysics/static/videos/6xplqg.mp4" controls="controls" muted="muted" playsinline="playsinline" width=480></video>
<video src="https://asgaardlab.github.io/CLIPxGamePhysics/static/videos/4jirzj.mp4" controls="controls" muted="muted" playsinline="playsinline" width=480></video>
|
Cubpaw/voxelgym_5c_42x42_250000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: rgb_label
dtype: image
- name: path_label
dtype: image
- name: path_rgb_label
dtype: image
splits:
- name: train
num_bytes: 186844000.0
num_examples: 200000
- name: validation
num_bytes: 46250450.0
num_examples: 50000
download_size: 177114030
dataset_size: 233094450.0
---
# Dataset Card for "voxelgym_5c_42x42_250000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_meta-llama__Llama-2-7b-hf | ---
pretty_name: Evaluation run of meta-llama/Llama-2-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 127 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 16 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
  \ the timestamp of the run. The \"train\" split is always pointing to the latest\
  \ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_meta-llama__Llama-2-7b-hf\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T13:00:54.924067](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-7b-hf/blob/main/results_2023-12-02T13-00-54.924067.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.14480667172100076,\n\
\ \"acc_stderr\": 0.009693234799052708\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.14480667172100076,\n \"acc_stderr\": 0.009693234799052708\n\
\ }\n}\n```"
repo_url: https://huggingface.co/meta-llama/Llama-2-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|arc:challenge|25_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|arc:challenge|25_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|arc:challenge|25_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|arc:challenge|25_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_drop_0
data_files:
- split: 2023_09_14T20_50_38.766533
path:
- '**/details_harness|drop|0_2023-09-14T20-50-38.766533.parquet'
- split: 2023_09_15T08_35_01.075146
path:
- '**/details_harness|drop|0_2023-09-15T08-35-01.075146.parquet'
- split: latest
path:
- '**/details_harness|drop|0_2023-09-15T08-35-01.075146.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_08T17_00_44.389859
path:
- '**/details_harness|drop|3_2023-09-08T17-00-44.389859.parquet'
- split: 2023_09_09T12_32_30.613622
path:
- '**/details_harness|drop|3_2023-09-09T12-32-30.613622.parquet'
- split: 2023_09_20T14_39_46.791628
path:
- '**/details_harness|drop|3_2023-09-20T14-39-46.791628.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-20T14-39-46.791628.parquet'
- config_name: harness_gsm8k_0
data_files:
- split: 2023_09_15T08_35_01.075146
path:
- '**/details_harness|gsm8k|0_2023-09-15T08-35-01.075146.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|0_2023-09-15T08-35-01.075146.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_08T17_00_44.389859
path:
- '**/details_harness|gsm8k|5_2023-09-08T17-00-44.389859.parquet'
- split: 2023_09_09T12_32_30.613622
path:
- '**/details_harness|gsm8k|5_2023-09-09T12-32-30.613622.parquet'
- split: 2023_09_20T14_39_46.791628
path:
- '**/details_harness|gsm8k|5_2023-09-20T14-39-46.791628.parquet'
- split: 2023_12_02T13_00_06.695936
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-00-06.695936.parquet'
- split: 2023_12_02T13_00_54.924067
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-00-54.924067.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-00-54.924067.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hellaswag|10_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hellaswag|10_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hellaswag|10_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hellaswag|10_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_0
data_files:
- split: 2023_08_21T20_09_03.352670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T20:09:03.352670.parquet'
- split: 2023_08_21T20_15_29.093529
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T20:15:29.093529.parquet'
- split: 2023_08_21T20_20_08.261679
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T20:20:08.261679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T20:20:08.261679.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T16:35:46.942696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_21T17_55_50.567332
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-21T17:55:50.567332.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T09:19:51.585793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:54:59.197645.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T09-53-02.418861.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_0
data_files:
- split: 2023_08_21T20_09_03.352670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T20:09:03.352670.parquet'
- split: 2023_08_21T20_15_29.093529
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T20:15:29.093529.parquet'
- split: 2023_08_21T20_20_08.261679
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T20:20:08.261679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T20:20:08.261679.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_21T17_55_50.567332
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-21T17:55:50.567332.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T16:35:46.942696.parquet'
- split: 2023_08_24T09_19_51.585793
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T09:19:51.585793.parquet'
- split: 2023_08_29T17_54_59.197645
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T17:54:59.197645.parquet'
- split: 2023_09_15T09_53_02.418861
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T09-53-02.418861.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T09-53-02.418861.parquet'
- config_name: harness_winogrande_0
data_files:
- split: 2023_09_15T08_35_01.075146
path:
- '**/details_harness|winogrande|0_2023-09-15T08-35-01.075146.parquet'
- split: latest
path:
- '**/details_harness|winogrande|0_2023-09-15T08-35-01.075146.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_08T17_00_44.389859
path:
- '**/details_harness|winogrande|5_2023-09-08T17-00-44.389859.parquet'
- split: 2023_09_09T12_32_30.613622
path:
- '**/details_harness|winogrande|5_2023-09-09T12-32-30.613622.parquet'
- split: 2023_09_20T14_39_46.791628
path:
- '**/details_harness|winogrande|5_2023-09-20T14-39-46.791628.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-20T14-39-46.791628.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:52:01.926454.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:52:01.926454.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T19_52_01.926454
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:52:01.926454.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:52:01.926454.parquet'
- config_name: results
data_files:
- split: 2023_08_19T16_35_46.942696
path:
- results_2023-08-19T16:35:46.942696.parquet
- split: 2023_08_21T17_55_50.567332
path:
- results_2023-08-21T17:55:50.567332.parquet
- split: 2023_08_21T20_09_03.352670
path:
- results_2023-08-21T20:09:03.352670.parquet
- split: 2023_08_21T20_15_29.093529
path:
- results_2023-08-21T20:15:29.093529.parquet
- split: 2023_08_21T20_20_08.261679
path:
- results_2023-08-21T20:20:08.261679.parquet
- split: 2023_08_24T09_19_51.585793
path:
- results_2023-08-24T09:19:51.585793.parquet
- split: 2023_08_28T19_52_01.926454
path:
- results_2023-08-28T19:52:01.926454.parquet
- split: 2023_08_29T17_54_59.197645
path:
- results_2023-08-29T17:54:59.197645.parquet
- split: 2023_09_08T17_00_44.389859
path:
- results_2023-09-08T17-00-44.389859.parquet
- split: 2023_09_09T12_32_30.613622
path:
- results_2023-09-09T12-32-30.613622.parquet
- split: 2023_09_14T20_50_38.766533
path:
- results_2023-09-14T20-50-38.766533.parquet
- split: 2023_09_15T08_35_01.075146
path:
- results_2023-09-15T08-35-01.075146.parquet
- split: 2023_09_15T09_53_02.418861
path:
- results_2023-09-15T09-53-02.418861.parquet
- split: 2023_09_20T14_39_46.791628
path:
- results_2023-09-20T14-39-46.791628.parquet
- split: 2023_12_02T13_00_06.695936
path:
- results_2023-12-02T13-00-06.695936.parquet
- split: 2023_12_02T13_00_54.924067
path:
- results_2023-12-02T13-00-54.924067.parquet
- split: latest
path:
- results_2023-12-02T13-00-54.924067.parquet
---
# Dataset Card for Evaluation run of meta-llama/Llama-2-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/meta-llama/Llama-2-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 127 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 16 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
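The split names above appear to follow a simple convention: the run timestamp with `-` and `:` replaced by `_` (the fractional-seconds suffix is kept as-is). A minimal sketch of that mapping, assuming the convention holds:

```python
def timestamp_to_split(ts: str) -> str:
    # Split names replace the '-' and ':' separators of the ISO timestamp
    # with '_', keeping the fractional-seconds suffix unchanged.
    return ts.replace("-", "_").replace(":", "_")

# e.g. the latest run of this dataset:
print(timestamp_to_split("2023-12-02T13:00:54.924067"))
# → 2023_12_02T13_00_54.924067
```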
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_meta-llama__Llama-2-7b-hf",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T13:00:54.924067](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-7b-hf/blob/main/results_2023-12-02T13-00-54.924067.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.14480667172100076,
"acc_stderr": 0.009693234799052708
},
"harness|gsm8k|5": {
"acc": 0.14480667172100076,
"acc_stderr": 0.009693234799052708
}
}
```
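For a quick sanity check, the aggregated dict above can be flattened into per-task metric rows; a small sketch using only the values shown (not a leaderboard API):

```python
# The latest-results payload for this run, copied from the JSON above.
results = {
    "all": {"acc": 0.14480667172100076, "acc_stderr": 0.009693234799052708},
    "harness|gsm8k|5": {"acc": 0.14480667172100076, "acc_stderr": 0.009693234799052708},
}

# Flatten into (task, metric, value) rows for easy tabulation.
rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]
for task, metric, value in rows:
    print(f"{task}\t{metric}\t{value:.4f}")
```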
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
JovialValley/syllable_totalMapped2 | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 108903948
num_examples: 390
- name: test
num_bytes: 26971340
num_examples: 97
download_size: 136776788
dataset_size: 135875288
---
# Dataset Card for "syllable_totalMapped2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdiOO7/SalesKRA | ---
license: apache-2.0
language:
- en
tags:
- code
pretty_name: Sales KRA Data
size_categories:
- n<1K
--- |
paweldali/Tylercrimetime | ---
license: unknown
---
|
stanmalkinson199/KyleBroflovskiClassic | ---
license: openrail
---
|
brainer/ecg_labeled | ---
dataset_info:
features:
- name: image
dtype: image
- name: width
dtype: int64
- name: height
dtype: int64
- name: pixel_values
sequence:
sequence:
sequence: uint8
- name: image_id
dtype: int64
- name: image_path
dtype: string
- name: objects
struct:
- name: area
sequence: float64
- name: bbox
sequence:
sequence: float64
- name: category
sequence: int64
- name: id
sequence: int64
splits:
- name: train
num_bytes: 441658472.0
num_examples: 150
- name: test
num_bytes: 55907924.0
num_examples: 19
- name: valid
num_bytes: 50005135.0
num_examples: 17
download_size: 82858518
dataset_size: 547571531.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
DiegoRoberto10/amandamelhor | ---
license: openrail
---
|
open-llm-leaderboard/details_marcel__phi-2-openhermes-30k | ---
pretty_name: Evaluation run of marcel/phi-2-openhermes-30k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcel/phi-2-openhermes-30k](https://huggingface.co/marcel/phi-2-openhermes-30k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcel__phi-2-openhermes-30k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T18:54:53.186103](https://huggingface.co/datasets/open-llm-leaderboard/details_marcel__phi-2-openhermes-30k/blob/main/results_2024-02-01T18-54-53.186103.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5732804491341541,\n\
\ \"acc_stderr\": 0.03371240537561013,\n \"acc_norm\": 0.5752830563376676,\n\
\ \"acc_norm_stderr\": 0.034401894257503625,\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.45379798730359744,\n\
\ \"mc2_stderr\": 0.015158432314849521\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848025,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892896\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.569308902609042,\n\
\ \"acc_stderr\": 0.004941609820763585,\n \"acc_norm\": 0.7471619199362677,\n\
\ \"acc_norm_stderr\": 0.004337506344899918\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.029445175328199586,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.029445175328199586\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131796,\n \"\
acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131796\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.026450874489042757,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.026450874489042757\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806585,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767755,\n\
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767755\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.031918633744784645,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.031918633744784645\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295838,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295838\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591518,\n \"\
acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591518\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460288,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460288\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648371,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648371\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306397,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306397\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.01442229220480886,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.01442229220480886\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159614,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159614\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.02726429759980402,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.02726429759980402\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027125115513166848,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027125115513166848\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557308,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557308\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n\
\ \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n\
\ \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5588235294117647,\n \"acc_stderr\": 0.02008736207670285,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.02008736207670285\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786865,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786865\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.038922121953330446,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.038922121953330446\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.45379798730359744,\n\
\ \"mc2_stderr\": 0.015158432314849521\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516146\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.49052312357846856,\n \
\ \"acc_stderr\": 0.013770010651168823\n }\n}\n```"
repo_url: https://huggingface.co/marcel/phi-2-openhermes-30k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|arc:challenge|25_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|gsm8k|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hellaswag|10_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-54-53.186103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T18-54-53.186103.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- '**/details_harness|winogrande|5_2024-02-01T18-54-53.186103.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T18-54-53.186103.parquet'
- config_name: results
data_files:
- split: 2024_02_01T18_54_53.186103
path:
- results_2024-02-01T18-54-53.186103.parquet
- split: latest
path:
- results_2024-02-01T18-54-53.186103.parquet
---
# Dataset Card for Evaluation run of marcel/phi-2-openhermes-30k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [marcel/phi-2-openhermes-30k](https://huggingface.co/marcel/phi-2-openhermes-30k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcel__phi-2-openhermes-30k",
"harness_winogrande_5",
	split="latest")
```
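The config names listed in the YAML header above appear to be a simple sanitization of the harness task names (the `|`, `-`, and `:` separators replaced by underscores). As a sketch, a small helper for mapping a task name to its config name might look like this (`task_to_config` is a hypothetical helper, not part of any library):

```python
def task_to_config(task_name: str) -> str:
    """Map a harness task name like 'harness|truthfulqa:mc|0'
    to its dataset config name like 'harness_truthfulqa_mc_0'."""
    # Replace every separator character used in task names with '_'.
    for sep in ("|", "-", ":"):
        task_name = task_name.replace(sep, "_")
    return task_name

# e.g. task_to_config("harness|winogrande|5") -> "harness_winogrande_5"
```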
## Latest results
These are the [latest results from run 2024-02-01T18:54:53.186103](https://huggingface.co/datasets/open-llm-leaderboard/details_marcel__phi-2-openhermes-30k/blob/main/results_2024-02-01T18-54-53.186103.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5732804491341541,
"acc_stderr": 0.03371240537561013,
"acc_norm": 0.5752830563376676,
"acc_norm_stderr": 0.034401894257503625,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179178,
"mc2": 0.45379798730359744,
"mc2_stderr": 0.015158432314849521
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848025,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892896
},
"harness|hellaswag|10": {
"acc": 0.569308902609042,
"acc_stderr": 0.004941609820763585,
"acc_norm": 0.7471619199362677,
"acc_norm_stderr": 0.004337506344899918
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.029445175328199586,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.029445175328199586
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.025699352832131796,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.025699352832131796
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042757,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042757
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806585,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.024864995159767755,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.024864995159767755
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.031918633744784645,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.031918633744784645
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295838,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295838
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591518,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591518
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460288,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648371,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306397,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306397
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.01442229220480886,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.01442229220480886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159614,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159614
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.02726429759980402,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.02726429759980402
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027125115513166848,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027125115513166848
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.02923346574557308,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.02923346574557308
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41460234680573665,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.41460234680573665,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.02008736207670285,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.02008736207670285
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786865,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786865
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.038922121953330446,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.038922121953330446
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179178,
"mc2": 0.45379798730359744,
"mc2_stderr": 0.015158432314849521
},
"harness|winogrande|5": {
"acc": 0.7490134175217048,
"acc_stderr": 0.012185776220516146
},
"harness|gsm8k|5": {
"acc": 0.49052312357846856,
"acc_stderr": 0.013770010651168823
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zdmc23/sft-1 | ---
dataset_info:
features:
- name: ts
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: model
dtype: string
splits:
- name: train
num_bytes: 351
num_examples: 2
download_size: 3153
dataset_size: 351
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
deepghs/3dbooru_full | ---
license: mit
task_categories:
- image-classification
- zero-shot-image-classification
- text-to-image
language:
- en
tags:
- art
- anime
- not-for-all-audiences
size_categories:
- 100K<n<1M
annotations_creators:
- no-annotation
source_datasets:
- behoimi
---
# Behoimi Full Dataset
This is the full dataset of [behoimi.org](http://behoimi.org/) (also named 3dbooru). All of the original images are maintained here.
If you need to download the files from the URLs in `table.csv`, remember to **add `Referer: http://behoimi.org/` to the request headers** when downloading them, or your requests will be forbidden.
# Information
## Images
There are 650256 images in total. The maximum ID of these images is 653114. Last updated at `2024-04-02 04:02:35 UTC`.
Here is the information for the 50 most recent images:
| id | filename | width | height | type | tags | file_url |
|-------:|:-----------|--------:|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------|
| 653114 | 653114.jpg | 683 | 1024 | image/jpeg | 2b blonde_hair christmas kaddi_cosplay nier nier_automata red_dress red_legwear thighhighs | http://behoimi.org/data/cb/59/cb5923e439f81e492c24a84ff151cc27.jpg |
| 653113 | 653113.jpg | 2048 | 1370 | image/jpeg | breath_of_fire cosplay nina skirt | http://behoimi.org/data/3e/b3/3eb32eb41cff3111551ff0c14e4825d5.jpg |
| 653111 | 653111.jpg | 800 | 1200 | image/jpeg | apple_pie_princess bed cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/15/18/151837c3417739682d6404a41787d074.jpg |
| 653110 | 653110.jpg | 800 | 1200 | image/jpeg | apple_pie_princess bed cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails | http://behoimi.org/data/8a/92/8a92acdface6eba3565611f5b1516a80.jpg |
| 653109 | 653109.jpg | 800 | 1200 | image/jpeg | apple_pie_princess bed cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails | http://behoimi.org/data/ee/87/ee87226eae6604cc69351bf91a3876ff.jpg |
| 653108 | 653108.jpg | 800 | 1200 | image/jpeg | apple_pie_princess ass bed cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/d3/a2/d3a2d061fb4dee5f378a5d54c638958f.jpg |
| 653107 | 653107.jpg | 1200 | 800 | image/jpeg | apple_pie_princess cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/c2/04/c204ac11aec97df52b513e0cf193c908.jpg |
| 653106 | 653106.jpg | 800 | 1200 | image/jpeg | apple_pie_princess cosplay dress gloves hairbow idolmaster sakura_ema thighhighs tiara totoki_airi twintails white_legwear zettai_ryouiki | http://behoimi.org/data/df/4b/df4b62e8f912259ed6160a6ebc0f6cd8.jpg |
| 653105 | 653105.jpg | 1200 | 800 | image/jpeg | apple_pie_princess ass cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/83/af/83af37b90b18d196b988478d884825a7.jpg |
| 653104 | 653104.jpg | 1200 | 800 | image/jpeg | apple_pie_princess ass cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/be/6e/be6e3016e59edcf6f9b45987a70443dc.jpg |
| 653103 | 653103.jpg | 1200 | 800 | image/jpeg | apple_pie_princess ass cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/b1/74/b174edd0ab101a4f5fc6ef73fba8b169.jpg |
| 653102 | 653102.jpg | 1200 | 800 | image/jpeg | apple_pie_princess cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/4c/d6/4cd6ad1bb24425f5aedca909210a4cff.jpg |
| 653101 | 653101.jpg | 1200 | 800 | image/jpeg | apple_pie_princess cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/e6/cb/e6cb4a7238b449ed20c3b196107fed5d.jpg |
| 653100 | 653100.jpg | 1200 | 800 | image/jpeg | apple_pie_princess cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/c2/3c/c23c64815bfabd466ff033658e922909.jpg |
| 653099 | 653099.jpg | 800 | 1200 | image/jpeg | apple_pie_princess cosplay dress gloves hairbow idolmaster sakura_ema thighhighs tiara totoki_airi twintails white_legwear zettai_ryouiki | http://behoimi.org/data/a9/12/a91212973df552c2bda5426e1688f7c1.jpg |
| 653098 | 653098.jpg | 1200 | 800 | image/jpeg | apple_pie_princess cosplay dress gloves hairbow idolmaster sakura_ema thighhighs tiara totoki_airi twintails white_legwear zettai_ryouiki | http://behoimi.org/data/b7/6a/b76a59fd7d145df83ba0fd7b9bb84537.jpg |
| 653097 | 653097.jpg | 800 | 1200 | image/jpeg | apple_pie_princess ass cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/8d/8a/8d8ad751323b699520b082a400e4c2b0.jpg |
| 653096 | 653096.jpg | 1200 | 800 | image/jpeg | apple_pie_princess bed cosplay dress gloves hairbow idolmaster sakura_ema tiara totoki_airi twintails | http://behoimi.org/data/01/5a/015a06f0fce63e092698172719a390af.jpg |
| 653095 | 653095.jpg | 800 | 1200 | image/jpeg | apple_pie_princess bed bottomless cosplay dress gloves hairbow idolmaster sakura_ema thighhighs tiara totoki_airi twintails | http://behoimi.org/data/ab/f5/abf57a83b7e65e0ea7a1436e61a6cf7a.jpg |
| 653094 | 653094.jpg | 800 | 1200 | image/jpeg | apple_pie_princess bed bottomless cosplay dress gloves hairbow idolmaster sakura_ema thighhighs tiara totoki_airi twintails | http://behoimi.org/data/56/13/5613aaeb90e35d9c6e3786fea3458479.jpg |
| 653093 | 653093.jpg | 800 | 1200 | image/jpeg | apple_pie_princess bed cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/a1/9b/a19b246e4d7f128a913b2e84137ddd37.jpg |
| 653092 | 653092.jpg | 1200 | 800 | image/jpeg | apple_pie_princess bed cosplay dress gloves hairbow idolmaster panties sakura_ema thighhighs tiara totoki_airi twintails white_legwear | http://behoimi.org/data/9d/f7/9df7d13b51229f87ace4df795509bce4.jpg |
| 653091 | 653091.jpg | 1620 | 1080 | image/jpeg | apron aqua_hair bed cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) stuffed_animal sword_art_online syringe teddy_bear | http://behoimi.org/data/5f/c8/5fc8e037ced6c265fccf1294746d35e3.jpg |
| 653090 | 653090.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair bed cosplay dress ely elysion hairband nurse nurse_uniform pantyhose shinon_(sao) stuffed_animal sword_art_online syringe teddy_bear white_legwear | http://behoimi.org/data/95/ba/95bafb11a56ee070e756fd1b92621a36.jpg |
| 653089 | 653089.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair bed cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) stuffed_animal sword_art_online syringe teddy_bear | http://behoimi.org/data/88/12/88122697e61323b25fd58dd1f89e1d82.jpg |
| 653088 | 653088.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair bed cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) stuffed_animal sword_art_online syringe teddy_bear | http://behoimi.org/data/65/3f/653f5ff6f67532a7f66a8672f51d164d.jpg |
| 653087 | 653087.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair bed cosplay dress ely elysion hairband nurse nurse_uniform pantyhose shinon_(sao) sword_art_online white_legwear | http://behoimi.org/data/16/ff/16ffbc9053d681060de51589b3b8cd26.jpg |
| 653086 | 653086.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair bed cosplay dress ely elysion hairband nurse nurse_uniform pantyhose shinon_(sao) sword_art_online white_legwear | http://behoimi.org/data/88/9a/889a02b960877105171b48b250e309b7.jpg |
| 653085 | 653085.jpg | 1620 | 1080 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online | http://behoimi.org/data/e3/fe/e3fef42b9d4ee024ff630c85fed0917a.jpg |
| 653084 | 653084.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform pantyhose shinon_(sao) sword_art_online white_legwear | http://behoimi.org/data/cb/34/cb34f70b5722d21c600f8dd1adccbc42.jpg |
| 653083 | 653083.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform pantyhose shinon_(sao) sword_art_online white_legwear | http://behoimi.org/data/02/77/02777d9745ce2eaf7afb6800d360f624.jpg |
| 653082 | 653082.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform pantyhose shinon_(sao) sword_art_online white_legwear | http://behoimi.org/data/95/18/9518c856241d064e26400357260f7b03.jpg |
| 653081 | 653081.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform pantyhose shinon_(sao) sword_art_online white_legwear | http://behoimi.org/data/3f/be/3fbe4fe678004e7f7046326fb7ed7f71.jpg |
| 653080 | 653080.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online syringe | http://behoimi.org/data/74/9c/749c2de42215375528a4449bff037792.jpg |
| 653079 | 653079.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online syringe | http://behoimi.org/data/40/94/4094cc0ebb7ebe4bd8486545d481d5ea.jpg |
| 653078 | 653078.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online syringe | http://behoimi.org/data/17/1d/171de6d4641e3f1cba547c16b8793601.jpg |
| 653077 | 653077.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online syringe | http://behoimi.org/data/9e/05/9e05f419071885930595afaf8c8acc4e.jpg |
| 653076 | 653076.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online syringe | http://behoimi.org/data/93/3d/933d3955a27bc8dba2af410ecf45b29e.jpg |
| 653075 | 653075.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online | http://behoimi.org/data/84/04/84045c19f0f86404a829b21a85ef441f.jpg |
| 653074 | 653074.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform pantyhose shinon_(sao) sword_art_online white_legwear | http://behoimi.org/data/6c/04/6c04f177aacd0b0fcb515b93d9db3cef.jpg |
| 653073 | 653073.jpg | 1080 | 1620 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online | http://behoimi.org/data/dd/5d/dd5d50229e1e93b9776847c4ec1ab24b.jpg |
| 653072 | 653072.jpg | 1620 | 1080 | image/jpeg | apron aqua_hair cosplay dress ely elysion hairband nurse nurse_uniform shinon_(sao) sword_art_online | http://behoimi.org/data/17/0d/170d163572c87f00b39b03ca302589c8.jpg |
| 653071 | 653071.jpg | 1200 | 1800 | image/jpeg | bed bishoujo_senshi_sailor_moon blue_hair choker circlet cosplay elbow_gloves gloves kneehighs mizuno_ami nagisa_mark-02 pleated_skirt sailor_mercury sailor_uniform sapphire school_uniform skirt | http://behoimi.org/data/ce/67/ce6760f937421f452afb5f6126941541.jpg |
| 653070 | 653070.jpg | 1200 | 1800 | image/jpeg | bed bishoujo_senshi_sailor_moon blue_hair choker circlet cosplay elbow_gloves gloves kneehighs mizuno_ami nagisa_mark-02 panties pleated_skirt sailor_mercury sailor_uniform sapphire school_uniform skirt | http://behoimi.org/data/26/a0/26a089722f7044ccdb4f410c5626db94.jpg |
| 653069 | 653069.jpg | 1800 | 1200 | image/jpeg | bed bishoujo_senshi_sailor_moon blue_hair choker circlet cosplay elbow_gloves gloves kneehighs mizuno_ami nagisa_mark-02 pleated_skirt sailor_mercury sailor_uniform sapphire school_uniform skirt | http://behoimi.org/data/2f/dd/2fdda24f31d064cfe604f602b8c6b936.jpg |
| 653068 | 653068.jpg | 1800 | 1200 | image/jpeg | bed bishoujo_senshi_sailor_moon blue_hair choker circlet cosplay elbow_gloves gloves kneehighs mizuno_ami nagisa_mark-02 pleated_skirt sailor_mercury sailor_uniform sapphire school_uniform skirt | http://behoimi.org/data/9e/f9/9ef9fa4feca10c7542c16a4fbb5d1255.jpg |
| 653067 | 653067.jpg | 1800 | 1200 | image/jpeg | bed bishoujo_senshi_sailor_moon blue_hair choker circlet cosplay elbow_gloves gloves mizuno_ami nagisa_mark-02 pleated_skirt sailor_mercury sailor_uniform sapphire school_uniform skirt | http://behoimi.org/data/0b/96/0b969e6274a9a986c188e568ae645170.jpg |
| 653066 | 653066.jpg | 1800 | 1200 | image/jpeg | bed bishoujo_senshi_sailor_moon blue_hair choker circlet cosplay elbow_gloves gloves mizuno_ami nagisa_mark-02 pleated_skirt sailor_mercury sailor_uniform sapphire school_uniform skirt | http://behoimi.org/data/30/c8/30c8b1002ed344209f16cda3a594407f.jpg |
| 653065 | 653065.jpg | 1800 | 1200 | image/jpeg | bed bishoujo_senshi_sailor_moon blue_hair choker circlet cosplay elbow_gloves gloves mizuno_ami nagisa_mark-02 pleated_skirt sailor_mercury sailor_uniform sapphire school_uniform skirt | http://behoimi.org/data/e1/3b/e13b4588362b948213b312a1c6a40488.jpg |
| 653064 | 653064.jpg | 1800 | 1200 | image/jpeg | bed bishoujo_senshi_sailor_moon blue_hair choker circlet cosplay elbow_gloves gloves mizuno_ami nagisa_mark-02 sailor_mercury sailor_uniform sapphire school_uniform | http://behoimi.org/data/92/20/9220e1963d619b7e35a5634eeeacc03b.jpg |
## Tags
There are 15854 tags in total.
These are the top 30 tags of type `general (0)`:
| id | tag | type | count | ambiguous |
|------:|:-------------------|-------:|--------:|:------------|
| 8 | cosplay | 0 | 266680 | False |
| 7897 | swimsuit | 0 | 239558 | False |
| 218 | cleavage | 0 | 165746 | False |
| 759 | skirt | 0 | 163913 | False |
| 57 | bikini | 0 | 142117 | False |
| 1534 | blouse | 0 | 123938 | False |
| 6 | thighhighs | 0 | 123757 | False |
| 83 | dress | 0 | 105380 | False |
| 96 | pantyhose | 0 | 102200 | False |
| 555 | pleated_skirt | 0 | 81522 | False |
| 81 | school_uniform | 0 | 80063 | False |
| 12 | twintails | 0 | 79191 | False |
| 711 | miniskirt | 0 | 71329 | False |
| 186 | bed | 0 | 64042 | False |
| 1058 | side-tie_bikini | 0 | 62353 | False |
| 43 | ass | 0 | 58208 | False |
| 543 | zettai_ryouiki | 0 | 58023 | False |
| 20 | blonde_hair | 0 | 57576 | False |
| 102 | gloves | 0 | 55852 | False |
| 127 | hairband | 0 | 51636 | False |
| 45 | panties | 0 | 50543 | False |
| 47 | shorts | 0 | 48838 | False |
| 17372 | sheer_legwear | 0 | 48579 | False |
| 14 | boots | 0 | 47381 | False |
| 189 | bikini_top | 0 | 46492 | False |
| 17370 | black_legwear | 0 | 35307 | False |
| 1602 | sailor_uniform | 0 | 33595 | False |
| 549 | ponytail | 0 | 33572 | False |
| 612 | kneesocks | 0 | 33257 | False |
| 7911 | one-piece_swimsuit | 0 | 32605 | False |
These are the top 16 tags of type `artist (1)`:
| id | tag | type | count | ambiguous |
|------:|:--------------------------------------|-------:|--------:|:------------|
| 5488 | akb48 | 1 | 978 | False |
| 12897 | mimi_girls | 1 | 130 | False |
| 16461 | caroline_charonplop_kyary_pamyu_pamyu | 1 | 86 | False |
| 12892 | ske48 | 1 | 51 | False |
| 14259 | super☆girls | 1 | 34 | False |
| 4226 | kimura_harushi | 1 | 33 | False |
| 17473 | afilia_saga | 1 | 26 | False |
| 10936 | watarirouka_hashiritai | 1 | 24 | False |
| 3335 | perfume_(girl_group) | 1 | 20 | False |
| 14130 | hello_project | 1 | 17 | False |
| 14086 | pink_lady | 1 | 13 | False |
| 2919 | cherry_boom | 1 | 4 | False |
| 15752 | sayura | 1 | 4 | False |
| 20421 | kawahara_kanae | 1 | 1 | False |
| 20420 | nakamura_tomoe | 1 | 1 | False |
| 20464 | tsuki_des | 1 | 1 | False |
These are the top 30 tags of type `copyright (3)`:
| id | tag | type | count | ambiguous |
|------:|:---------------------------------------------|-------:|--------:|:------------|
| 3016 | vocaloid | 3 | 22779 | False |
| 3560 | original | 3 | 22087 | False |
| 411 | touhou | 3 | 15811 | False |
| 13811 | love_live!_school_idol_project | 3 | 13492 | False |
| 3540 | idolmaster | 3 | 11719 | False |
| 3308 | tagme_series | 3 | 8745 | False |
| 4952 | macross | 3 | 8267 | False |
| 3903 | macross_frontier | 3 | 8176 | False |
| 12086 | fate/series | 3 | 7250 | False |
| 237 | neon_genesis_evangelion | 3 | 6937 | False |
| 13112 | idolmaster_cinderella_girls | 3 | 6548 | False |
| 10517 | puella_magi_madoka_magica | 3 | 5720 | False |
| 113 | suzumiya_haruhi_no_yuuutsu | 3 | 5579 | False |
| 17237 | kantai_collection | 3 | 5556 | False |
| 6272 | k-on! | 3 | 4296 | False |
| 10616 | nitro_super_sonic | 3 | 4158 | False |
| 1407 | fate/stay_night | 3 | 3540 | False |
| 7189 | bakemonogatari | 3 | 2723 | False |
| 260 | code_geass | 3 | 2673 | False |
| 977 | ikkitousen | 3 | 2539 | False |
| 215 | dead_or_alive | 3 | 2373 | False |
| 3032 | gundam | 3 | 2164 | False |
| 5405 | project_diva | 3 | 2164 | False |
| 9795 | ore_no_imouto_ga_konna_ni_kawaii_wake_ga_nai | 3 | 2045 | False |
| 6814 | rebuild_of_evangelion | 3 | 1903 | False |
| 7982 | inu_boku_secret_service | 3 | 1891 | False |
| 8607 | bishoujo_senshi_sailor_moon | 3 | 1852 | False |
| 328 | final_fantasy | 3 | 1846 | False |
| 1117 | tengen_toppa_gurren-lagann | 3 | 1824 | False |
| 10498 | boku_wa_tomodachi_ga_sukunai | 3 | 1791 | False |
These are the top 30 tags of type `character (4)`:
| id | tag | type | count | ambiguous |
|------:|:------------------------------|-------:|--------:|:------------|
| 2152 | hatsune_miku | 4 | 13998 | False |
| 3307 | tagme_character | 4 | 9431 | False |
| 4487 | sheryl_nome | 4 | 5086 | False |
| 10615 | super_soniko | 4 | 4060 | False |
| 210 | soryu_asuka_langley | 4 | 3946 | False |
| 7 | suzumiya_haruhi | 4 | 3324 | False |
| 3638 | izayoi_sakuya | 4 | 3114 | False |
| 2925 | kagamine_rin | 4 | 3068 | False |
| 3902 | ranka_lee | 4 | 2852 | False |
| 15643 | minami_kotori | 4 | 2646 | False |
| 336 | saber | 4 | 2367 | False |
| 5725 | megurine_luka | 4 | 2363 | False |
| 3644 | hakurei_reimu | 4 | 2204 | False |
| 14594 | nishikino_maki | 4 | 2112 | False |
| 13971 | yazawa_niko | 4 | 2080 | False |
| 10636 | kaname_madoka | 4 | 1929 | False |
| 17236 | shimakaze_(kantai_collection) | 4 | 1846 | False |
| 209 | ayanami_rei | 4 | 1816 | False |
| 10747 | akemi_homura | 4 | 1721 | False |
| 3247 | yoko_ritona | 4 | 1553 | False |
| 724 | meiko | 4 | 1471 | False |
| 6401 | nakano_azusa | 4 | 1471 | False |
| 3037 | kasumi_(doa) | 4 | 1439 | False |
| 10383 | gokou_ruri | 4 | 1429 | False |
| 6243 | akiyama_mio | 4 | 1384 | False |
| 9702 | patchouli_knowledge | 4 | 1383 | False |
| 6811 | makinami_mari_illustrious | 4 | 1372 | False |
| 6838 | kochiya_sanae | 4 | 1344 | False |
| 261 | cc | 4 | 1276 | False |
| 7983 | shirakiin_ririchiyo | 4 | 1267 | False |
These are the top 30 tags of type `model (5)`:
| id | tag | type | count | ambiguous |
|------:|:----------------|-------:|--------:|:------------|
| 938 | saku | 5 | 14847 | False |
| 6837 | lenfried | 5 | 7530 | False |
| 3417 | necoco | 5 | 5900 | False |
| 89 | chocoball | 5 | 5884 | False |
| 9758 | higurashi_rin | 5 | 4998 | False |
| 13525 | arai_yomi | 5 | 3905 | False |
| 8276 | itsuki_akira | 5 | 3333 | False |
| 8147 | hanamura_misaki | 5 | 3154 | False |
| 1995 | hoshino_aki | 5 | 3150 | False |
| 3710 | hara_mikie | 5 | 2964 | False |
| 832 | non | 5 | 2838 | False |
| 7560 | hamada_yuri | 5 | 2780 | False |
| 6017 | mike | 5 | 2621 | False |
| 11845 | yamanaka_tomoe | 5 | 2567 | False |
| 4497 | mashiro_yuki | 5 | 2461 | False |
| 9769 | kurasaka_kururu | 5 | 2401 | False |
| 4490 | sugihara_anri | 5 | 2232 | False |
| 11673 | enako | 5 | 2189 | False |
| 1976 | kawamura_yukie | 5 | 2150 | False |
| 11416 | yamanaka_mayumi | 5 | 2148 | False |
| 4192 | haruka | 5 | 2129 | False |
| 4841 | himemiya_mahore | 5 | 2126 | False |
| 1019 | sugimoto_yumi | 5 | 2126 | False |
| 1872 | akiyama_rina | 5 | 2100 | False |
| 11276 | amatsuka_miyu | 5 | 2091 | False |
| 10924 | kamui_arisa | 5 | 2077 | False |
| 3245 | tani_momoko | 5 | 2068 | False |
| 233 | kipi | 5 | 2001 | False |
| 2018 | yamamoto_azusa | 5 | 1942 | False |
| 2555 | kiguchi_aya | 5 | 1915 | False |
These are the top 30 tags of type `photo_set (6)`:
| id | tag | type | count | ambiguous |
|------:|:------------------------------|-------:|--------:|:------------|
| 8183 | girlz_high | 6 | 10613 | False |
| 11886 | deep_rabbit | 6 | 1167 | False |
| 16266 | happa_kyoukan_to_pantsu_meido | 6 | 623 | False |
| 14265 | double_journalism | 6 | 611 | False |
| 17356 | love_a_live | 6 | 578 | False |
| 13469 | rinrin_ga_ippa_korekushon_2 | 6 | 563 | False |
| 11817 | crystal_crown | 6 | 546 | False |
| 18782 | scene_ever_4 | 6 | 528 | False |
| 16945 | busujima_saeko_no_eroi_toko | 6 | 490 | False |
| 8884 | lenfriedom!_type_d | 6 | 484 | False |
| 12467 | red_moon | 6 | 456 | False |
| 9618 | going_my_yui_style | 6 | 450 | False |
| 10423 | the_kei_m@ster | 6 | 449 | False |
| 14454 | uzuki_generation | 6 | 443 | False |
| 11893 | seifuku_zukan_4 | 6 | 439 | False |
| 8383 | midnight_karen_2 | 6 | 431 | False |
| 10925 | alice_mixs | 6 | 423 | False |
| 19557 | senku | 6 | 423 | False |
| 2988 | beauty_leg | 6 | 419 | False |
| 18477 | scene_ever_3 | 6 | 409 | False |
| 13680 | touhou_kinbauchou | 6 | 405 | False |
| 12622 | ninjin | 6 | 404 | False |
| 17210 | eastern_fran | 6 | 403 | False |
| 19135 | kuubo- | 6 | 401 | False |
| 17892 | mofumofu_technique_2 | 6 | 394 | False |
| 11749 | lenfriedom!_type_jk | 6 | 391 | False |
| 7229 | kaoru's_collection | 6 | 386 | False |
| 19630 | shimapan_and_stocking_necosmo | 6 | 383 | False |
| 20237 | future_pop!! | 6 | 381 | False |
| 18897 | m_vol.1 | 6 | 381 | False |
|
DeepFoldProtein/Contrastive_Test_Processed | ---
dataset_info:
features:
- name: query_accession
sequence: string
- name: excludes
sequence:
sequence: string
- name: query_sequence
sequence: string
- name: target_accessions
sequence:
sequence: string
- name: target_sequences
sequence:
sequence: string
- name: input_ids
sequence:
sequence:
sequence: int64
- name: attention_mask
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 7383655
num_examples: 8
download_size: 2115701
dataset_size: 7383655
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jijay/instructpix2pix-demo | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 4850304.0
num_examples: 4
download_size: 0
dataset_size: 4850304.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instructpix2pix-demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
McAldo/platypus_embeddings | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
splits:
- name: train
num_bytes: 30738175.902270723
num_examples: 24895
download_size: 15478591
dataset_size: 30738175.902270723
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
odunola/french-english-unprocessed | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: english_transcript
dtype: string
splits:
- name: train
num_bytes: 1706450505.708
num_examples: 11386
download_size: 1532120167
dataset_size: 1706450505.708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_115 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1470502752.0
num_examples: 286536
download_size: 1496551849
dataset_size: 1470502752.0
---
# Dataset Card for "chunk_115"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jamestalentium/dialogsum_250_rm | ---
dataset_info:
features:
- name: id
dtype: string
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 229527.04654895666
num_examples: 250
download_size: 133097
dataset_size: 229527.04654895666
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dialogsum_250_rm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
olatte/AI_INET_Assignment3 | ---
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- object-detection
---
# Dataset Card for NYU AI INET ASSIGNMENT 3
## Dataset Summary
YouTube video frames for experimentation in AI INET Assignment 3. |
kor_qpair | ---
annotations_creators:
- expert-generated
language_creators:
- other
language:
- ko
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- semantic-similarity-classification
pretty_name: KorQpair
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: is_duplicate
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 515365
num_examples: 6136
- name: test
num_bytes: 63466
num_examples: 758
- name: validation
num_bytes: 57242
num_examples: 682
download_size: 545236
dataset_size: 636073
---
# Dataset Card for KorQpair
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/songys/Question_pair)
- **Repository:** [Github](https://github.com/songys/Question_pair)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
Each row in the dataset contains two questions and a `is_duplicate` label.
- `question1`: The first question
- `question2`: The second question
- `is_duplicate`: 0 if `question1` and `question2` are semantically similar; 1 otherwise
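As a quick illustration of this row format, here is a minimal sketch using made-up rows (the Korean text below is invented for illustration and is not taken from the dataset):

```python
# Hypothetical rows mirroring the fields described above; the text is
# invented for illustration, not taken from the dataset itself.
rows = [
    {"question1": "오늘 날씨 어때?",
     "question2": "지금 날씨가 어떤가요?",
     "is_duplicate": 0},  # 0 = semantically similar
    {"question1": "점심 뭐 먹을까?",
     "question2": "서울역까지 어떻게 가나요?",
     "is_duplicate": 1},  # 1 = not similar
]

# Keep only the semantically similar pairs.
similar_pairs = [r for r in rows if r["is_duplicate"] == 0]
print(len(similar_pairs))  # prints 1
```

In practice the dataset would typically be loaded with the `datasets` library rather than constructed by hand.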
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@jaketae](https://github.com/jaketae) for adding this dataset. |
CyberHarem/futaba_tsukushi_bangdream | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of futaba_tsukushi (BanG Dream!)
This is the dataset of futaba_tsukushi (BanG Dream!), containing 130 images and their tags.
The core tags of this character are `long_hair, bangs, black_hair, twintails, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 130 | 180.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_tsukushi_bangdream/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 130 | 99.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_tsukushi_bangdream/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 314 | 208.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_tsukushi_bangdream/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 130 | 157.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_tsukushi_bangdream/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 314 | 304.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_tsukushi_bangdream/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/futaba_tsukushi_bangdream',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | looking_at_viewer, black_gloves, earrings, hat, blush, 1girl, half_gloves, skirt, solo, white_headwear, long_sleeves, black_ribbon, smile, open_mouth, red_eyes, low_twintails, neck_ribbon, white_background, white_jacket, cowboy_shot, holding |
| 1 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, polka_dot_dress, smile, solo, yellow_dress, belt, hair_ornament, long_sleeves, simple_background, blush, closed_mouth, crossed_arms, floating_hair, hair_bow, standing, white_background, brown_footwear, choker, collarbone, frilled_dress, frilled_socks, full_body, medium_dress, necklace, polka_dot_bow, red_footwear, shiny_hair, white_socks |
| 2 | 20 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, long_sleeves, pleated_skirt, white_sailor_collar, smile, blue_skirt, collarbone, simple_background, white_background, blue_shirt, low_twintails, upper_body, white_neckerchief, blue_serafuku, breasts, closed_mouth, open_mouth, purple_hair |
| 3 | 7 |  |  |  |  |  | 1girl, blush, small_breasts, solo, looking_at_viewer, open_mouth, collarbone, simple_background, white_background, bikini, cleavage, navel, purple_hair, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | black_gloves | earrings | hat | blush | 1girl | half_gloves | skirt | solo | white_headwear | long_sleeves | black_ribbon | smile | open_mouth | red_eyes | low_twintails | neck_ribbon | white_background | white_jacket | cowboy_shot | holding | polka_dot_dress | yellow_dress | belt | hair_ornament | simple_background | closed_mouth | crossed_arms | floating_hair | hair_bow | standing | brown_footwear | choker | collarbone | frilled_dress | frilled_socks | full_body | medium_dress | necklace | polka_dot_bow | red_footwear | shiny_hair | white_socks | pleated_skirt | white_sailor_collar | blue_skirt | blue_shirt | upper_body | white_neckerchief | blue_serafuku | breasts | purple_hair | small_breasts | bikini | cleavage | navel | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:---------------|:-----------|:------|:--------|:--------|:--------------|:--------|:-------|:-----------------|:---------------|:---------------|:--------|:-------------|:-----------|:----------------|:--------------|:-------------------|:---------------|:--------------|:----------|:------------------|:---------------|:-------|:----------------|:--------------------|:---------------|:---------------|:----------------|:-----------|:-----------|:-----------------|:---------|:-------------|:----------------|:----------------|:------------|:---------------|:-----------|:----------------|:---------------|:-------------|:--------------|:----------------|:----------------------|:-------------|:-------------|:-------------|:--------------------|:----------------|:----------|:--------------|:----------------|:---------|:-----------|:--------|:----------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | | X | X | | | X | | X | | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 20 |  |  |  |  |  | X | | | | X | X | | | X | | X | | X | X | | X | | X | | | | | | | | X | X | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | X | X | | | X | | | | | X | | | | X | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_wnli_if_would | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 332
num_examples: 1
- name: test
num_bytes: 1239
num_examples: 5
- name: train
num_bytes: 932
num_examples: 3
download_size: 11766
dataset_size: 2503
---
# Dataset Card for "MULTI_VALUE_wnli_if_would"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gsh3729/sw | ---
dataset_info:
features:
- name: filename
dtype: string
- name: tif
dtype: binary
- name: tfw
dtype: binary
splits:
- name: train
num_bytes: 48361
num_examples: 4
download_size: 49914
dataset_size: 48361
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LLukas22/nq-simplified | ---
license: cc-by-sa-3.0
task_categories:
- question-answering
- sentence-similarity
- feature-extraction
language:
- en
---
# Dataset Card for "nq"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Homepage:** [https://ai.google.com/research/NaturalQuestions](https://ai.google.com/research/NaturalQuestions)
### Dataset Summary
This is a modified version of the original Natural Questions (NQ) dataset for question-answering tasks. The original is available [here](https://ai.google.com/research/NaturalQuestions).
Each sample was preprocessed into a SQuAD-like format: the context was shortened from an entire Wikipedia article to the passage containing the answer.
## Dataset Structure
### Data Instances
An example of 'train' looks as follows.
```json
{
"context": "The 2017 Major League Baseball All - Star Game was the 88th edition of the Major League Baseball All Star Game. The game was",
"question": "where is the 2017 baseball all-star game being played",
"answers":
{
"text":["Marlins Park"],
"answer_start":[171]
}
}
```
### Data Fields
The data fields are the same among all splits.
- `question`: a `string` feature.
- `context`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
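
Since the answers follow the SQuAD convention, `answer_start` is a character offset into `context`. A minimal sketch of recovering the answer span (the sample below is made up for illustration, not taken from the dataset):

```python
# Hypothetical sample in the same SQuAD-like layout as this dataset;
# the text and offset here are illustrative only.
sample = {
    "context": "The stadium opened in 2012 in Miami.",
    "question": "When did the stadium open?",
    "answers": {"text": ["2012"], "answer_start": [22]},
}

start = sample["answers"]["answer_start"][0]
answer = sample["answers"]["text"][0]

# The stored answer text should match the slice of the context it points to.
assert sample["context"][start:start + len(answer)] == answer
```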
## Additional Information
### Licensing Information
This dataset is distributed under the cc-by-sa-3.0 license. |
jaydeepb-21/REBELCUSTOMDATA | ---
license: other
---
|
Melonie/video_frames_test | ---
license: bigscience-openrail-m
---
|
DataStudio/OCR_redSeal_redo_02 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: Noise_level
dtype: int64
splits:
- name: train
num_bytes: 962292803.875
num_examples: 91977
download_size: 963706057
dataset_size: 962292803.875
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "OCR_redSeal_redo_02"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/k5_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of k5/K5/K5 (Girls' Frontline)
This is the dataset of k5/K5/K5 (Girls' Frontline), containing 82 images and their tags.
The core tags of this character are `blonde_hair, long_hair, bangs, breasts, blue_eyes, braid, medium_breasts, hair_ornament, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 82 | 117.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k5_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 82 | 63.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k5_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 195 | 134.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k5_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 82 | 103.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k5_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 195 | 195.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k5_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/k5_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, solo, white_background, bare_shoulders, choker, collarbone, simple_background, smile, upper_body, black_dress, jewelry, closed_mouth, large_breasts, official_alternate_costume |
| 1 | 7 |  |  |  |  |  | 1girl, cleavage, closed_mouth, collarbone, holding_gun, looking_at_viewer, solo, black_thighhighs, dress, handgun, jacket, simple_background, smile, white_background, blush, full_body, long_sleeves, toeless_legwear, bare_shoulders, black_choker, off_shoulder, trigger_discipline, arm_up, index_finger_raised, open_coat, standing, toes, white_leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | cleavage | looking_at_viewer | solo | white_background | bare_shoulders | choker | collarbone | simple_background | smile | upper_body | black_dress | jewelry | closed_mouth | large_breasts | official_alternate_costume | holding_gun | black_thighhighs | dress | handgun | jacket | full_body | long_sleeves | toeless_legwear | black_choker | off_shoulder | trigger_discipline | arm_up | index_finger_raised | open_coat | standing | toes | white_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------|:--------------------|:-------|:-------------------|:-----------------|:---------|:-------------|:--------------------|:--------|:-------------|:--------------|:----------|:---------------|:----------------|:-----------------------------|:--------------|:-------------------|:--------|:----------|:---------|:------------|:---------------|:------------------|:---------------|:---------------|:---------------------|:---------|:----------------------|:------------|:-----------|:-------|:----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
under-tree/sts_traces | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text1
dtype: string
- name: text2
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 28555408
num_examples: 15000
- name: val
num_bytes: 5686916
num_examples: 3000
download_size: 11941770
dataset_size: 34242324
---
# Dataset Card for "sts_traces"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ad321/test-tweets | ---
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- crowdsourced
license:
- gpl-3.0
multilinguality:
- monolingual
paperswithcode_id: null
pretty_name: tweeter-dataset-sent-analysis
size_categories:
- 1M<n<10M
source_datasets:
- original
tags: []
task_categories:
- text-classification
task_ids:
- sentiment-classification
train-eval-index:
- col_mapping:
label: labels
metrics:
- name: Accuracy
type: accuracy
- args:
average: binary
name: F1 binary
type: f1
tweet: text
config: default
splits:
train_split: train
validation_split: validation
task: text-classification
task_id: binary_classification
---
Tweets in English, labeled positive or negative. |
Josephgflowers/OpenOrca-Step-by-step-reasoning | ---
license: mit
---
This work was performed to help models with reasoning. I developed it while working on my Cinder model, a STEM Q&A model.
# Modified OpenORCA Step-by-Step Reasoning Dataset Overview
The Modified OpenORCA Step-by-Step Reasoning Dataset is designed to enhance the reasoning capabilities of AI models. It is the result of a meticulous process of sorting, selecting, and altering dialogues from the original OpenOrca collection, with a focus on promoting an intrinsic approach to step-by-step logical reasoning across a wide array of topics.
## Dataset Composition
Derived from the comprehensive OpenOrca dataset, this manually modified version strategically removes the sections of prompts that explicitly ask for step-by-step reasoning. Instead, it presents AI models with real-world scenarios that require deducing the logical steps needed to reach a conclusion without being prompted to do so, thereby encouraging models to develop a natural inclination towards systematic problem-solving. The dataset spans various domains, including but not limited to everyday logical puzzles, basic mathematical problems, and complex scenario-based queries.
## Features
- Size: 92.4 MB, 64,963 rows of dialogues that demonstrate step-by-step reasoning.
- Format: JSON, facilitating easy integration with common machine learning frameworks and environments.
- Content: Each entry includes a user query followed by a system-generated response that embodies step-by-step reasoning, without explicitly stating the requirement for such a process. This setup aims to train AI models to autonomously employ logical progression in their responses.
- Use Cases: Ideal for developing AI models geared towards natural language understanding, conversational AI, educational bots, and any application requiring a deep grasp of logical progression and problem-solving skills.
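
As a sketch of how such JSON records could be consumed (the field names below are assumptions based on the description above, not confirmed against the release; check the actual keys after downloading):

```python
import json

# Hypothetical JSON-lines records in the layout described above: a user
# query paired with a response that reasons step by step. Field names
# ("question", "response") are assumed for illustration.
lines = [
    '{"question": "If a pen costs 2 dollars, what do 3 pens cost?", '
    '"response": "Each pen costs 2 dollars. 3 pens cost 3 * 2 = 6 dollars."}',
]

records = [json.loads(line) for line in lines]
for rec in records:
    print(rec["question"], "->", rec["response"])
```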
## Potential Applications
- AI Model Training: Serves as an invaluable tool for training and refining AI models, especially those focused on natural language processing, conversational intelligence, and automated reasoning.
- Educational Technology: Offers a rich resource for creating educational bots and tools designed to assist in teaching logical reasoning, critical thinking, and problem-solving strategies.
- Research and Development: Provides a robust foundation for academic and commercial research into improving the step-by-step reasoning capabilities of AI systems, enhancing their ability to understand and interact with the world in a more human-like manner.
## Licensing and Accessibility
This dataset is distributed under the MIT License, allowing for broad use, modification, and distribution, provided that the original license and copyright notices are included. This liberal licensing ensures that the Modified OpenORCA Step-by-Step Reasoning Dataset can be freely utilized by researchers, developers, and educators to advance the field of AI and develop applications that benefit from enhanced reasoning capabilities.
For requests, questions, support, or to chat about current research, message me on Cinder's discord https://discord.gg/5ebjDrnZ
Or email Cinder-STEM@gmail.com
Original Open Orca dataset: https://huggingface.co/datasets/Open-Orca/OpenOrca
Inspired by the Microsoft unreleased datasets for Phi.
Special thanks to the contributors of the original dataset
Teknium
WingLian/Caseus
Eric Hartford
NanoBit
Pankaj
Winddude
Rohan
http://AlignmentLab.ai:
Autometa
Entropi
AtlasUnified
NeverendingToast
NanoBit
WingLian/Caseus
Also special thanks to TheBloke for supporting the community.
Original Open Orca Citation:
@misc{OpenOrca,
title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv}
}
@software{touvron2023llama,
title={LLaMA: Open and Efficient Foundation Language Models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
|
irenema/triage_dataset | ---
dataset_info:
features:
- name: pain
dtype: float64
- name: symptoms
dtype: string
splits:
- name: train
num_bytes: 10324
num_examples: 183
download_size: 5324
dataset_size: 10324
---
# Dataset Card for "triage_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1713115443 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26089
num_examples: 61
download_size: 13596
dataset_size: 26089
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
datasciathlete/corpus4everyone-klue-xsmall3-korean-balance-NER | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 38657483.2
num_examples: 36240
- name: validation
num_bytes: 9664370.8
num_examples: 9060
download_size: 5407329
dataset_size: 48321854.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Thomas-X-Yang/gsm8k-prolog | ---
license: mit
task_categories:
- question-answering
language:
- en
tags:
- code
size_categories:
- 1K<n<10K
source_datasets:
- gsm8k (https://huggingface.co/datasets/gsm8k)
---
# Dataset Card for GSM8K-Prolog
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Repository:** https://github.com/yxc-cyber/ProPer
- **Paper:** [Needs More Information]
### Dataset Summary
This is the Prolog-annotated version of the [GSM8K math reasoning dataset](https://huggingface.co/datasets/gsm8k).
We used the same dataset splits and questions as GSM8K and prompted GPT-4 to generate Prolog programs that solve the questions.
We then manually corrected some malfunctioning samples.
### Supported Tasks and Leaderboards
This dataset can be used to train language models to generate Prolog code that solves math questions, and to evaluate their performance.
### Languages
The instructions and inputs in the dataset are in English. The outputs are in Prolog.
## Dataset Structure
### Data Instances
Each instance contains a string to instruct the model, a string for the grade-school level math question, and a string for a Prolog program which solves the question.
```python
{
"instruction": "Please generate a piece of Prolog code to solve the given math problem.",
"input": "Natalia sold clips to 48 of her friends in April, and then she sold half as many clips in May. How many clips did Natalia sell altogether in April and May?",
"output": ":- use_module(library(clpq)).\n\nsell_clips(natalia, april, 48).\n\nsolve(Total_amount) :-\n sell_clips(natalia, april, April_amount),\n {May_amount = April_amount / 2},\n {Total_amount = May_amount + April_amount}."
}
```
### Data Fields
The GSM8K-Prolog dataset follows the instruction prompt used in [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca).
The data fields are the same among `train` and `test` splits.
- instruction: The task the model should perform. All the instructions in this dataset are the same because the only task here is to use Prolog to solve math questions.
- input: The question to a grade school math problem.
- output: The Prolog program which solves the math question.
### Data Splits
| name |train|validation|
|--------|----:|---------:|
|main | 7473| 1319|
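
Since the card states the records follow the Stanford Alpaca instruction format, a record can be assembled into a training prompt roughly as follows (the exact template wording is an assumption; adjust it to match your training setup):

```python
def build_prompt(record: dict) -> str:
    # Alpaca-style template: the instruction, then the math question as
    # input, with the Prolog program expected as the response.
    return (
        "Below is an instruction that describes a task, paired with an input "
        "that provides further context. Write a response that appropriately "
        "completes the request.\n\n"
        f"### Instruction:\n{record['instruction']}\n\n"
        f"### Input:\n{record['input']}\n\n"
        "### Response:\n"
    )

# The instruction/input pair from the data instance shown above.
record = {
    "instruction": "Please generate a piece of Prolog code to solve the given math problem.",
    "input": "Natalia sold clips to 48 of her friends in April, and then she sold half as many clips in May. How many clips did Natalia sell altogether in April and May?",
}
print(build_prompt(record))
```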
## Additional Information
### Licensing Information
The GSM8K-Prolog dataset is licensed under the [MIT License](https://opensource.org/licenses/MIT).
### Citation Information
[Needs More Information] |
Producoes/josh | ---
license: openrail
---
|
dongyoung4091/hh-generated_flan_t5_large_flan_t5_large_zeroshot | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: zeroshot_helpfulness
dtype: float64
- name: zeroshot_specificity
dtype: float64
- name: zeroshot_intent
dtype: float64
- name: zeroshot_factuality
dtype: float64
- name: zeroshot_easy-to-understand
dtype: float64
- name: zeroshot_relevance
dtype: float64
- name: zeroshot_readability
dtype: float64
- name: zeroshot_enough-detail
dtype: float64
- name: 'zeroshot_biased:'
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences
dtype: float64
- name: zeroshot_repetetive
dtype: float64
- name: zeroshot_fail-to-consider-context
dtype: float64
- name: zeroshot_too-long
dtype: float64
splits:
- name: train
num_bytes: 6336357
num_examples: 25600
download_size: 814393
dataset_size: 6336357
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hh-generated_flan_t5_large_flan_t5_large_zeroshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
holyofferings/tunellm | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4859
num_examples: 15
download_size: 6473
dataset_size: 4859
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
myleslinder/crema-d | ---
license: odbl
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: actor_id
dtype: string
- name: sentence
dtype: string
- name: emotion_intensity
dtype: string
- name: label
dtype:
class_label:
names:
'0': anger
'1': disgust
'2': fear
'3': happy
'4': neutral
'5': sad
splits:
- name: train
num_bytes: 1736578
num_examples: 7442
download_size: 470756748
dataset_size: 1736578
---
# Dataset Card for CREMA-D (Crowd-sourced Emotional Multimodal Actors Dataset)
## Dataset Description
- **Homepage:** <https://github.com/CheyneyComputerScience/CREMA-D>
- **Point of Contact:** <dcooper@wcupa.edu>
### Dataset Summary
CREMA-D is a data set of 7,442 original clips from 91 actors. These clips were from 48 male and 43 female actors between the ages of 20 and 74, coming from a variety of races and ethnicities (African American, Asian, Caucasian, Hispanic, and Unspecified).
Actors spoke from a selection of 12 sentences. The sentences were presented using one of six different emotions (Anger, Disgust, Fear, Happy, Neutral and Sad) and four different emotion levels (Low, Medium, High and Unspecified).
Participants rated the emotion and emotion levels based on the combined audiovisual presentation, the video alone, and the audio alone. Due to the large number of ratings needed, this effort was crowd-sourced and a total of 2443 participants each rated 90 unique clips: 30 audio, 30 visual, and 30 audio-visual. 95% of the clips have more than 7 ratings.
### Languages
English
## Dataset Structure
### Data Instances
```json
{
'path': '.../.cache/huggingface/datasets/downloads/extracted/.../data/AudioWAV/1001_DFA_ANG_XX.wav',
'audio': {
'path': '.../.cache/huggingface/datasets/downloads/extracted/.../data/AudioWAV/1001_DFA_ANG_XX.wav',
'array': array([
-1.35336370e-06,
-1.84488497e-04,
-2.73496640e-04,
1.40174336e-04,
8.33026352e-05,
0.00000000e+00
]),
'sampling_rate': 16000
},
'actor_id': '1001',
'sentence': "Don't forget a jacket",
'emotion_intensity': 'Unspecified',
'label': 0
}
```
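
The `audio` feature decodes to a dict with a sample array and sampling rate, and `label` is a class index into the names declared in the YAML above. A small sketch (using a dummy array rather than a real clip) of deriving a clip's duration and label name:

```python
# Class names in index order, as declared in the dataset features above.
LABEL_NAMES = ["anger", "disgust", "fear", "happy", "neutral", "sad"]

# Dummy stand-in for a decoded example: 2 seconds of silence at 16 kHz.
example = {
    "audio": {"array": [0.0] * 32000, "sampling_rate": 16000},
    "label": 0,
}

duration_s = len(example["audio"]["array"]) / example["audio"]["sampling_rate"]
label_name = LABEL_NAMES[example["label"]]
print(f"{label_name}: {duration_s:.1f} s")  # anger: 2.0 s
```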
## Additional Information
### Citation Information
```BibTex
@article{cao2014crema,
title={CREMA-D: Crowd-sourced Emotional Multimodal Actors Dataset},
author={Cao, H. and Cooper, D. G. and Keutmann, M. K. and Gur, R. C. and Nenkova, A. and Verma, R.},
journal={IEEE transactions on affective computing},
volume={5},
number={4},
pages={377--390},
year={2014},
doi={10.1109/TAFFC.2014.2336244},
url={https://doi.org/10.1109/TAFFC.2014.2336244}
}
```
|
AdapterOcean/gorilla_16k_standardized_cluster_4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 38528069
num_examples: 4128
download_size: 10236984
dataset_size: 38528069
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gorilla_16k_standardized_cluster_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
viruPVS/langchain-docs | ---
license: mit
---
|
helliun/happychat-dataset-halfsplit | ---
dataset_info:
features:
- name: input
sequence: string
- name: output
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 7323889
num_examples: 5058
download_size: 3993763
dataset_size: 7323889
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "happychat-dataset-halfsplit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MLP-Lemma/SFT-arxiv | ---
dataset_info:
features:
- name: context
dtype: string
- name: summary
dtype: string
- name: sentences
sequence: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 14196998433
num_examples: 203037
- name: validation
num_bytes: 435239658
num_examples: 6436
- name: test
num_bytes: 436024954
num_examples: 6440
download_size: 7276254120
dataset_size: 15068263045
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
srvs/training | ---
license: artistic-2.0
---
|
CyberHarem/yukari_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yukari/勘解由小路ユカリ/缘里 (Blue Archive)
This is the dataset of yukari/勘解由小路ユカリ/缘里 (Blue Archive), containing 244 images and their tags.
The core tags of this character are `long_hair, purple_hair, halo, purple_halo, hair_ornament, hair_flower, purple_eyes, eyeshadow, red_eyeshadow, breasts, fang, skin_fang, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 244 | 472.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukari_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 244 | 380.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukari_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 633 | 780.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukari_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yukari_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, haori, makeup, open_mouth, pink_flower, red_neckerchief, serafuku, smile, solo, upper_body, white_sailor_collar, looking_at_viewer, simple_background, white_background, blush, long_sleeves |
| 1 | 20 |  |  |  |  |  | 1girl, blush, haori, looking_at_viewer, pink_flower, red_neckerchief, serafuku, solo, makeup, open_mouth, simple_background, smile, pleated_skirt, white_sailor_collar, white_background, long_sleeves, wide_sleeves, blue_skirt, black_skirt |
| 2 | 8 |  |  |  |  |  | 1girl, blush, haori, holding_gun, looking_at_viewer, makeup, open_mouth, pink_flower, red_neckerchief, serafuku, simple_background, smile, solo, white_sailor_collar, white_background, long_sleeves, nail_polish, pleated_skirt, parted_bangs, black_skirt, blue_skirt, fingernails, petals, upper_body, wide_sleeves |
| 3 | 6 |  |  |  |  |  | 1girl, haori, looking_at_viewer, open_mouth, pleated_skirt, red_neckerchief, serafuku, smile, very_long_hair, white_sailor_collar, white_socks, zouri, blue_skirt, blush, full_body, long_sleeves, makeup, pink_flower, sandals, solo, wide_sleeves, simple_background, standing, white_background, parted_bangs, tabi |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, solo_focus, sweat, completely_nude, open_mouth, parted_bangs, smile, breasts_squeezed_together, heart, looking_at_viewer, paizuri, penis, collarbone, cum_on_breasts, makeup, mosaic_censoring, pink_flower, pov, sex |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | haori | makeup | open_mouth | pink_flower | red_neckerchief | serafuku | smile | solo | upper_body | white_sailor_collar | looking_at_viewer | simple_background | white_background | blush | long_sleeves | pleated_skirt | wide_sleeves | blue_skirt | black_skirt | holding_gun | nail_polish | parted_bangs | fingernails | petals | very_long_hair | white_socks | zouri | full_body | sandals | standing | tabi | 1boy | hetero | nipples | solo_focus | sweat | completely_nude | breasts_squeezed_together | heart | paizuri | penis | collarbone | cum_on_breasts | mosaic_censoring | pov | sex |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------|:-------------|:--------------|:------------------|:-----------|:--------|:-------|:-------------|:----------------------|:--------------------|:--------------------|:-------------------|:--------|:---------------|:----------------|:---------------|:-------------|:--------------|:--------------|:--------------|:---------------|:--------------|:---------|:-----------------|:--------------|:--------|:------------|:----------|:-----------|:-------|:-------|:---------|:----------|:-------------|:--------|:------------------|:----------------------------|:--------|:----------|:--------|:-------------|:-----------------|:-------------------|:------|:------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | X | | | X | | | | X | | | X | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
openworld-domains/conceptnet-full-en-essentials | ---
language:
- en
tags:
- knowledge-graph
- english
- relations
- conceptnet
pretty_name: 'ConceptNet Full EN [essentials]'
---
# ConceptNet Full EN (essentials)
## Dataset Summary:
This dataset is a compact and simplified version of ConceptNet, emphasizing English concepts and their sources. It retains the essential information about the relations in a format that is straightforward and user-friendly. Designed for efficiency and ease of use, this dataset is particularly suitable for scenarios with computational constraints. While the original ConceptNet database exceeds 20GB in size, this streamlined version is a mere 500MB, making it feasible to run on less powerful computers or servers without compromising the richness of English-centric knowledge.
## Motivation:
The genesis of this dataset lies in a comprehensive study aimed at enhancing the ability of AI systems to disambiguate English concepts. In the vast expanse of language, terms frequently adopt varied meanings depending on context, making it a daunting task for individuals to encompass all possible interpretations of a single term. ConceptNet, with its rich tapestry of relationships between concepts and their origins, provides a distinctive perspective to navigate and comprehend these layered meanings. By condensing this extensive knowledge repository into a more concise and accessible format, our analyses became not only swifter but also more resource-efficient. Recognizing the potential value to the broader community, we felt compelled to share this dataset openly.
## Dataset Structure
Before diving into the specific fields, it's essential to understand that each entry in this dataset captures a relationship. This relationship can be between two English concepts or between an English concept and its external source (URL). These relationships are detailed through a set of fields that provide both textual and URI-based identifiers, along with a weight that quantifies the relationship's strength or significance.
### Sample:
```plaintext
start_text: areligious
relation_text: Antonym
end_text: religious
start_uri: /c/en/areligious/a
relation_uri: /r/Antonym
end_uri: /c/en/religious
weight: 1
```
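The URI fields follow ConceptNet's path convention: a concept URI has the shape `/c/<language>/<term>[/<part-of-speech>]`, where the trailing part-of-speech tag (here `a` for adjective) is optional. As a small illustrative sketch (not part of the dataset itself), such a URI can be split into its components like this:

```python
def parse_concept_uri(uri: str) -> dict:
    """Split a ConceptNet concept URI such as '/c/en/areligious/a'
    into its language, term, and optional part-of-speech tag."""
    parts = uri.strip("/").split("/")
    if len(parts) < 3 or parts[0] != "c":
        raise ValueError(f"not a concept URI: {uri!r}")
    return {
        "language": parts[1],
        "term": parts[2],
        # the part-of-speech segment is optional
        "pos": parts[3] if len(parts) > 3 else None,
    }

print(parse_concept_uri("/c/en/areligious/a"))
# {'language': 'en', 'term': 'areligious', 'pos': 'a'}
```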
### Fields:
- **start_text**: The textual representation of the starting concept in the relationship.
- **relation_text**: The textual description of the type of relationship, whether it's between two concepts or a concept and its source.
- **end_text**: The textual representation of the ending concept or the external URL in the relationship.
- **surfaceText**: A human-readable sentence or phrase that illustrates the relationship between the `start_text` and `end_text` concepts (not shown in the sample above).
- **start_uri**: The unique URI identifier for the starting concept.
- **relation_uri**: The unique URI identifier for the type of relationship. This URI can be used to understand the nature of the relationship. For a comprehensive list of relation types and their meanings, refer to [ConceptNet's Relation Types Documentation](https://github.com/commonsense/conceptnet5/wiki/Relations).
- **end_uri**: The unique URI identifier for the ending concept or the external URL.
- **weight**: A numerical value representing the strength or significance of the relationship. The weight is derived from the number of sources that support the relationship and the reliability of these sources. A higher weight indicates a stronger consensus or more robust evidence for the relationship in the knowledge base.
**Note**: The dataset contains multiple self-looping relations. These were present in the original ConceptNet data, and the decision was made to retain them for this compact version.
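Once loaded, each row is a flat record with the fields listed above, so querying is straightforward with plain Python. The sketch below uses toy records shaped like the sample (field names are from this card; the second record and its values are illustrative, not taken from the dataset):

```python
# Toy records mirroring the row structure described in the Fields section.
rows = [
    {"start_text": "areligious", "relation_text": "Antonym",
     "end_text": "religious", "weight": 1.0},
    {"start_text": "religious", "relation_text": "RelatedTo",
     "end_text": "faith", "weight": 2.5},
]

def relations_for(term: str, rows: list) -> list:
    """Collect (relation, other end, weight) triples starting at `term`."""
    return [(r["relation_text"], r["end_text"], r["weight"])
            for r in rows if r["start_text"] == term]

print(relations_for("areligious", rows))
# [('Antonym', 'religious', 1.0)]
```

The same pattern applies unchanged to rows loaded through the `datasets` library, since each example arrives as a dictionary keyed by these field names.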
## Licensing and Attribution
This dataset is released under the Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0). Users are free to use, modify, and distribute the dataset, provided they give appropriate credit to the original source and indicate if changes were made. For proper attribution, please reference the original ConceptNet data and this compact version when utilizing it in your work.
## Acknowledgments
A heartfelt thank you goes out to the ConceptNet team for their invaluable work in creating a comprehensive knowledge base that serves as the foundation for this dataset.
For academic purposes, we recommend citing the original ConceptNet work as follows:
```
Robyn Speer et al. ConceptNet 5. Available at: https://github.com/commonsense/conceptnet5
```
---
🌐 **Discover More About Our Journey**: If you're intrigued by what you see and want to dive deeper into our work, feel free to explore [Open World Domains](https://openworld.domains). 🌟 |
sunoociel/sunooteste1 | ---
license: openrail
---
|
shemil076/wasteCollectionClassification | ---
license: mit
---
|
open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b | ---
pretty_name: Evaluation run of pankajmathur/orca_mini_v3_70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pankajmathur/orca_mini_v3_70b](https://huggingface.co/pankajmathur/orca_mini_v3_70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.  The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T02:12:05.216705](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b/blob/main/results_2023-10-09T02-12-05.216705.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7010508529623596,\n\
\ \"acc_stderr\": 0.0309286120388273,\n \"acc_norm\": 0.7049679984523141,\n\
\ \"acc_norm_stderr\": 0.030896356315399304,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\
\ \"mc2_stderr\": 0.015087648780065216\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.013796182947785562,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6951802429794861,\n\
\ \"acc_stderr\": 0.00459390260197934,\n \"acc_norm\": 0.8785102569209321,\n\
\ \"acc_norm_stderr\": 0.0032602788112468337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708052,\n\
\ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708052\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.02573364199183898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8161290322580645,\n \"acc_stderr\": 0.02203721734026783,\n \"\
acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.02203721734026783\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"\
acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519513,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305733,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305733\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.0218552552634218,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.0218552552634218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5575418994413408,\n\
\ \"acc_stderr\": 0.01661139368726857,\n \"acc_norm\": 0.5575418994413408,\n\
\ \"acc_norm_stderr\": 0.01661139368726857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.021330868762127062,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.021330868762127062\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5851063829787234,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.5851063829787234,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.559973924380704,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.559973924380704,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427653,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427653\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\
\ \"mc2_stderr\": 0.015087648780065216\n }\n}\n```"
repo_url: https://huggingface.co/pankajmathur/orca_mini_v3_70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|arc:challenge|25_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hellaswag|10_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T02-12-05.216705.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T02-12-05.216705.parquet'
- config_name: results
data_files:
- split: 2023_10_09T02_12_05.216705
path:
- results_2023-10-09T02-12-05.216705.parquet
- split: latest
path:
- results_2023-10-09T02-12-05.216705.parquet
---
# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pankajmathur/orca_mini_v3_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pankajmathur/orca_mini_v3_70b](https://huggingface.co/pankajmathur/orca_mini_v3_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b",
"harness_truthfulqa_mc_0",
split="train")
```
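The per-task config names in this repository follow a predictable pattern derived from the task name (for example `harness_hendrycksTest_abstract_algebra_5` for the 5-shot MMLU abstract algebra subtask). As a sketch, a small helper (hypothetical, not part of the `datasets` library) can build these names programmatically:

```python
def mmlu_config_name(subject: str, num_fewshot: int = 5) -> str:
    """Build the config name used by this repo for a 5-shot MMLU
    ("hendrycksTest") subtask, e.g. "harness_hendrycksTest_abstract_algebra_5"."""
    return f"harness_hendrycksTest_{subject}_{num_fewshot}"

config = mmlu_config_name("abstract_algebra")
# The resulting name can then be passed to load_dataset(...) as above.
```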
## Latest results
These are the [latest results from run 2023-10-09T02:12:05.216705](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b/blob/main/results_2023-10-09T02-12-05.216705.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7010508529623596,
"acc_stderr": 0.0309286120388273,
"acc_norm": 0.7049679984523141,
"acc_norm_stderr": 0.030896356315399304,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6126968953087459,
"mc2_stderr": 0.015087648780065216
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.013796182947785562,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.6951802429794861,
"acc_stderr": 0.00459390260197934,
"acc_norm": 0.8785102569209321,
"acc_norm_stderr": 0.0032602788112468337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708052,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708052
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.03013590647851756,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.03013590647851756
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.02573364199183898,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.02573364199183898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026783,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026783
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528436,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528436
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640255,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519513,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305733,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305733
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5575418994413408,
"acc_stderr": 0.01661139368726857,
"acc_norm": 0.5575418994413408,
"acc_norm_stderr": 0.01661139368726857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.021330868762127062,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.021330868762127062
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5851063829787234,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.5851063829787234,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.559973924380704,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.559973924380704,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.017401816711427653,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.017401816711427653
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6126968953087459,
"mc2_stderr": 0.015087648780065216
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
arieg/bw_spec_cls_160 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '56248'
'1': '56249'
'2': '56273'
'3': '56274'
'4': '56275'
'5': '56465'
'6': '56466'
'7': '56467'
'8': '56468'
'9': '56469'
'10': '56470'
'11': '56471'
'12': '56472'
'13': '56474'
'14': '56493'
'15': '56495'
'16': '56496'
'17': '56497'
'18': '56498'
'19': '56499'
'20': '56516'
'21': '56517'
'22': '56518'
'23': '56519'
'24': '56520'
'25': '56521'
'26': '56639'
'27': '56640'
'28': '56641'
'29': '56645'
'30': '56646'
'31': '56648'
'32': '56649'
'33': '56650'
'34': '56651'
'35': '56686'
'36': '56687'
'37': '56688'
'38': '56689'
'39': '56690'
'40': '56691'
'41': '56692'
'42': '56693'
'43': '56694'
'44': '56695'
'45': '56696'
'46': '56795'
'47': '56796'
'48': '56797'
'49': '56798'
'50': '56799'
'51': '56800'
'52': '56801'
'53': '56802'
'54': '56803'
'55': '56804'
'56': '56805'
'57': '56888'
'58': '57164'
'59': '57175'
'60': '57176'
'61': '57177'
'62': '57178'
'63': '57179'
'64': '57180'
'65': '57344'
'66': '57360'
'67': '57371'
'68': '57417'
'69': '57418'
'70': '57440'
'71': '57442'
'72': '57500'
'73': '57569'
'74': '57626'
'75': '57627'
'76': '57628'
'77': '57629'
'78': '57630'
'79': '57639'
'80': '57640'
'81': '57648'
'82': '57658'
'83': '57661'
'84': '57662'
'85': '57663'
'86': '57665'
'87': '57691'
'88': '57697'
'89': '57819'
'90': '57820'
'91': '57821'
'92': '57822'
'93': '57823'
'94': '57936'
'95': '57937'
'96': '57938'
'97': '57939'
'98': '57943'
'99': '57968'
'100': '58052'
'101': '58053'
'102': '58054'
'103': '58060'
'104': '58061'
'105': '58063'
'106': '58068'
'107': '58070'
'108': '58115'
'109': '58116'
'110': '58117'
'111': '58135'
'112': '58140'
'113': '58161'
'114': '58162'
'115': '58164'
'116': '58166'
'117': '58169'
'118': '58170'
'119': '58173'
'120': '58174'
'121': '58212'
'122': '58213'
'123': '58215'
'124': '58221'
'125': '58225'
'126': '58341'
'127': '58474'
'128': '59078'
'129': '59373'
'130': '59374'
'131': '59561'
'132': '59653'
'133': '59654'
'134': '59656'
'135': '59657'
'136': '59658'
'137': '59659'
'138': '59660'
'139': '59663'
'140': '59664'
'141': '59666'
'142': '59667'
'143': '59669'
'144': '59671'
'145': '59673'
'146': '59675'
'147': '59676'
'148': '59677'
'149': '59678'
'150': '59679'
'151': '59680'
'152': '59681'
'153': '59682'
'154': '59683'
'155': '59684'
'156': '59685'
'157': '59686'
'158': '59687'
'159': '59688'
splits:
- name: train
num_bytes: 179214128.0
num_examples: 3200
download_size: 179008943
dataset_size: 179214128.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bw_spec_cls_160"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maidalun1020/CrosslingualRetrievalOthersEn2Zh | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 4629552
num_examples: 22959
- name: corpus
num_bytes: 3726790
num_examples: 3919
download_size: 6216012
dataset_size: 8356342
---
|
iix/Parquet_FIles | ---
license: mit
task_categories:
- text-classification
- text-generation
language:
- en
tags:
- code
pretty_name: '*'
size_categories:
- 1M<n<10M
---
# Parquet_Files
# Cross Language (CL) Datasets
Four datasets of language-pair translations originating from CORDIS Project News (https://elrc-share.eu/).
```
Structured as follows:
| Field | Description |
| --------------- | ----------------------------------------------------------------------- |
| de/es/fr/it | Non-English transcripts of sentences |
| en | English translations of sentences |
120k rows (approx. per parquet file)
```
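As a minimal, self-contained sketch, a single row from one of these files can be reshaped into a source/target translation pair. The column names follow the table above; the sample sentence is illustrative, and actually reading a parquet file (e.g. with pandas or pyarrow) is assumed and not shown here.

```python
# Sketch: turning one {de, en} row into a (source, target) translation pair.
# Column names follow the table above; the sample row is illustrative.
# (Loading the real file would use e.g. pandas.read_parquet, not shown.)

def to_pair(row: dict, src_lang: str = "de") -> tuple:
    """Return (source_sentence, english_translation) from one row."""
    return row[src_lang], row["en"]

row = {"de": "Das Projekt beginnt im Januar.", "en": "The project starts in January."}
src, tgt = to_pair(row)
print(f"{src} -> {tgt}")
```

The same helper works for the `es`, `fr`, and `it` files by changing `src_lang`.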
# Characters.parquet
One large dataset of synthetic characters, with information about them and their outputs. It was created by combining other synthetic character datasets, which can be found at: https://shorturl.at/gjIJ5, https://shorturl.at/DFQT6 & https://shorturl.at/tHX68.
```
Structured as follows:
| Field | Description |
| -------------- | --------------------------------------------------------------- |
| name | Character name |
| categories | Categories or genres associated with the character (array) |
| personalities | Personality traits and characteristics of the character (array) |
| description | Description of the character |
| conversation | Conversational interactions involving the character (array) |
17.6k rows
```
# Flores7Lang.parquet
Seven languages merged from the Flores-200 Dataset (https://github.com/facebookresearch/flores/blob/main/flores200/README.md)
```
Structured as follows:
| Field | Description |
| --------------- | ----------------------------------------------------- |
| deu | Sentence translated into German |
| eng | Sentence translated into English |
| epo | Sentence translated into Esperanto |
| fra | Sentence translated into French |
| ita | Sentence translated into Italian |
| spa | Sentence translated into Spanish |
| tur | Sentence translated into Turkish |
2.1k rows
```
# Fraud_detection.parquet
A synthetic financial dataset intended for fraud detection tasks (https://www.kaggle.com/datasets/ealaxi/paysim1).
```
Structured as follows:
| Field | Description |
| --------------- | ------------------------------------------------------------- |
| Step | Maps a unit of time in the real world (1 step = 1 hour) |
| Type | CASH-IN, CASH-OUT, DEBIT, PAYMENT, or TRANSFER |
| Amount | Amount of the transaction in local currency |
| nameOrig | Customer who initiated the transaction |
| oldbalanceOrg | Initial balance before the transaction |
| newbalanceOrig | Customer's balance after the transaction |
| nameDest | Recipient ID of the transaction |
| oldbalanceDest | Initial recipient balance before the transaction |
| newbalanceDest | Recipient's balance after the transaction |
| isFraud | Identifies a fraudulent transaction (1) or non-fraudulent (0) |
2.27m rows
``` |
ailearningcorner/sample_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: ID
dtype: int64
- name: ' Student'
dtype: string
splits:
- name: train
num_bytes: 128.1
num_examples: 7
- name: test
num_bytes: 54.9
num_examples: 3
download_size: 2655
dataset_size: 183.0
---
# Dataset Card for "sample_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainingDataPro/on-device-face-liveness-detection | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
- image-to-image
- image-classification
language:
- en
---
# Mobile Face Liveness Detection
The dataset consists of videos featuring individuals wearing various types of masks. Videos are recorded under **different lighting conditions** and with **different attributes** (*glasses, masks, hats, hoods, wigs, and mustaches for men*).
In the dataset, there are **4 types of videos** filmed on mobile devices:
- **2D mask with holes for eyes** - demonstration of an attack with a paper/cardboard mask (*mask*)
- **2D mask with holes for eyes, nose, and mouth** - demonstration of an attack with a paper/cardboard mask with cutouts for the nose and mouth (*mask_cut*)
- **2D mask** - demonstration of an attack with a paper/cardboard silhouette (*outline*)
- **Real Video** - demonstration of a real person's face (*real*)

The dataset supports researchers and developers in **facial expression recognition and analysis, anti-spoofing, face detection, re-identification, and face recognition tasks**. The inclusion of various attributes and different lighting conditions aims to enhance the **robustness and effectiveness** of anti-spoofing models in real-world scenarios.
## The full version of the dataset includes 7,200+ videos of people; leave a request on **[TrainingData](https://trainingdata.pro/data-market/on-device-face-liveness-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=on-device-face-liveness-detection)** to buy the dataset
### Statistics for the dataset (gender, type of the device, type of the attack):

# Get the Dataset
## This is just an example of the data
Leave a request on **[https://trainingdata.pro/data-market](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=on-device-face-liveness-detection) to learn about the price and buy the dataset**
# Content
The folder **files** includes:
- **mask** - includes videos of people wearing 2D mask with holes for eyes,
- **mask_cut** - includes videos of people wearing 2D mask with holes for eyes, nose, and mouth,
- **outline** - includes videos of people wearing 2D mask,
- **real** - includes real videos of people
### File with the extension .csv
- **file**: link to access the file,
- **type**: type of the video (*real, mask, outline, mask_cut*)
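As a minimal sketch of consuming this annotation file (assuming the two-column layout above; the sample rows and file paths are hypothetical stand-ins for the real CSV), the clips can be grouped by attack type with the standard library:

```python
import csv
import io

# Group video files by attack type, following the CSV layout described above.
# The sample rows and paths below are hypothetical stand-ins for the real file.
sample_csv = """file,type
files/real/0001.mp4,real
files/mask/0002.mp4,mask
files/mask_cut/0003.mp4,mask_cut
files/outline/0004.mp4,outline
"""

by_type = {}
for row in csv.DictReader(io.StringIO(sample_csv)):
    by_type.setdefault(row["type"], []).append(row["file"])

print(sorted(by_type))  # the four video types: mask, mask_cut, outline, real
```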
## **[TrainingData](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=on-device-face-liveness-detection)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **<https://www.kaggle.com/trainingdatapro/datasets>**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
*keywords: ibeta level 1, ibeta level 2, liveness detection systems, liveness detection dataset, biometric dataset, biometric data dataset, biometric system attacks, anti-spoofing dataset, face liveness detection, deep learning dataset, face spoofing database, face anti-spoofing, face recognition, face detection, face identification, human video dataset, video dataset, presentation attack detection, presentation attack dataset, 2d print attacks, print 2d attacks dataset, printed 2d masks dataset, spoofing in 2D face recognition, facial masks, 2D face recognition systems, detecting face spoofing attacks, detecting presentation attacks, computer vision, surveillance face anti-spoofing, face liveness detection software solution* |
SevgiY/chatbot-bitirme | ---
dataset_info:
features:
- name: Müşteri
dtype: string
- name: Müşteri servisi
dtype: string
splits:
- name: train
num_bytes: 1657210.4968509448
num_examples: 3600
- name: validation
num_bytes: 184134.49965010496
num_examples: 400
download_size: 549274
dataset_size: 1841344.9965010497
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
adambuttrick/50K_deduplicated_ner_indexes_name_country_alpaca_format_json_response | ---
license: cc0-1.0
---
|
pharaouk/math-orca-arch | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 225322861
num_examples: 200035
download_size: 84248748
dataset_size: 225322861
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- question-answering
language:
- en
tags:
- math
size_categories:
- 100K<n<1M
---
# Dataset Card
<!-- Provide a quick summary of the dataset. -->
This dataset contains ~200K grade school math word problems. All the answers in this dataset are generated using Azure GPT-4 Turbo. Please refer to [Orca-Math: Unlocking the potential of
SLMs in Grade School Math](https://arxiv.org/pdf/2402.14830.pdf) for details about the dataset construction.
### Dataset Description
- **Curated by:** Microsoft
- **Language(s) (NLP):** English
- **License:** MIT
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [microsoft/orca-math-word-problems-200k](https://huggingface.co/datasets/microsoft/orca-math-word-problems-200k)
- **Paper:** [Orca-Math: Unlocking the potential of
SLMs in Grade School Math](https://arxiv.org/pdf/2402.14830.pdf)
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset has been designed to enhance the mathematical abilities of language models. It aims to provide a robust foundation for language models to excel in mathematical problem-solving.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
This dataset is not intended for use in educational systems or organizations.
## Dataset Structure
### Data Instances
A typical data entry in the dataset consists of a question and its corresponding answer. Below is an example from the dataset:
```python
{'question': 'In a highly contested election having multiple candidates, Mr. Jackson, one of the losing candidates, received 3,485,782 votes, which accounted for precisely 38.7 percent of all votes. To have achieved a victory, he would have needed to secure at least 51 percent of all votes. Approximately, what percent of the remaining unsecured votes would Mr. Jackson have needed to accumulate to reach this victory threshold?',
'answer': "First, let's find out the total number of votes cast in the election. Since Mr. Jackson received 38.7% of all votes, and that amounted to 3,485,782 votes, we can set up the following equation to find the total number of votes (T):\n\n0.387 * T = 3,485,782\n\nNow, solve for T:\n\nT = 3,485,782 / 0.387\nT ≈ 9,000,467 votes (total number of votes cast)\n\nTo win, Mr. Jackson would have needed 51% of the total votes. Let's calculate that amount:\n\n0.51 * T = 0.51 * 9,000,467\n0.51 * T ≈ 4,590,238 votes needed to win\n\nNow, let's find out how many more votes Mr. Jackson needed to reach this winning threshold:\n\nVotes needed to win - Votes Mr. Jackson received = Additional votes needed\n4,590,238 - 3,485,782 = 1,104,456 additional votes needed\n\nNow, let's find out what percentage of the remaining unsecured votes this number represents. The remaining unsecured votes are the votes that were not for Mr. Jackson, which is 100% - 38.7% = 61.3% of the total votes.\n\n61.3% of the total votes is the remaining unsecured votes:\n\n0.613 * T = 0.613 * 9,000,467\n0.613 * T ≈ 5,514,686 votes were unsecured\n\nNow, we'll calculate the percentage of these unsecured votes that the additional votes needed represent:\n\n(Additional votes needed / Unsecured votes) * 100 = Percentage of unsecured votes needed\n(1,104,456 / 5,514,686) * 100 ≈ 20.03%\n\nSo, Mr. Jackson would have needed approximately 20.03% of the remaining unsecured votes to reach the victory threshold of 51%."}
```
### Data Fields
The dataset comprises the following fields:
- `question`: a string containing the question to be answered.
- `answer`: a string containing the answer to the corresponding question.
### Data Splits
The dataset is split into a training set. The number of rows in each split is as follows:
- `train`: 200,035 rows
The `DatasetDict` structure for the dataset is as follows:
```python
DatasetDict({
'train': Dataset({
features: ['question', 'answer'],
num_rows: 200035
})
})
```
Each split in the `DatasetDict` contains a `Dataset` object with the specified features and number of rows.
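For illustration, here is a minimal sketch of iterating over rows with this schema, using a small in-memory sample instead of the full 200K-row download (for the real data you would call `load_dataset` on this repository, which requires the `datasets` library and network access):

```python
# Sketch: computing simple statistics over {question, answer} rows.
# The two sample rows below are illustrative; the real train split has
# 200,035 rows with the same two string fields.
train = [
    {"question": "A train travels 60 km in 1.5 hours. What is its speed?",
     "answer": "Speed = distance / time = 60 / 1.5 = 40 km/h."},
    {"question": "What is 15% of 200?",
     "answer": "15% of 200 = 0.15 * 200 = 30."},
]

mean_len = sum(len(r["answer"]) for r in train) / len(train)
print(f"{len(train)} rows, mean answer length {mean_len:.1f} characters")
```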
## Dataset Creation
Please refer to [Orca-Math: Unlocking the potential of
SLMs in Grade School Math](https://arxiv.org/pdf/2402.14830.pdf) for details about the dataset construction.
### Source Data
- [Lila](https://huggingface.co/datasets/allenai/lila)
- [DMath](https://arxiv.org/ftp/arxiv/papers/2106/2106.15772.pdf)
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
Please refer to [Orca-Math: Unlocking the potential of
SLMs in Grade School Math](https://arxiv.org/pdf/2402.14830.pdf) for details about the dataset construction.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
Microsoft
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
We expanded a seed set of questions using Azure GPT-4 Turbo. The answers to those questions are generated using Azure GPT-4 Turbo.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
None
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
This dataset is in English and contains only math word problems.
## Citation
If you find this work useful in your method, you can cite the paper as below:
```
@misc{mitra2024orcamath,
title={Orca-Math: Unlocking the potential of SLMs in Grade School Math},
author={Arindam Mitra and Hamed Khanpour and Corby Rosset and Ahmed Awadallah},
year={2024},
eprint={2402.14830},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Dataset Card Contact
[Arindam Mitra](armitra@microsoft.com)
|
llm-aes/pandalm-annotated | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input_x
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: annotator
dtype: string
- name: preference
dtype: int64
- name: raw_completion
dtype: string
- name: idx
dtype: int64
- name: annotator1
dtype: int64
- name: annotator2
dtype: int64
- name: annotator3
dtype: int64
- name: generator_1
dtype: string
- name: generator_2
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 2582344
num_examples: 3223
download_size: 183113
dataset_size: 2582344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pandalm-annotated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adhisetiawan/pothole-segmentation-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 6311570.0
num_examples: 79
- name: validation
num_bytes: 337776.0
num_examples: 5
download_size: 6599847
dataset_size: 6649346.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Seanxh/twitter_dataset_1713195113 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 74155
num_examples: 172
download_size: 31264
dataset_size: 74155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zeroshot/text-2-cypher | ---
license: apache-2.0
---
|
fathyshalab/massive_play | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 70622
num_examples: 1377
- name: validation
num_bytes: 12960
num_examples: 260
- name: test
num_bytes: 19816
num_examples: 387
download_size: 0
dataset_size: 103398
---
# Dataset Card for "massive_play"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/oasst1_prompt_en | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 32670635
num_examples: 20976
download_size: 12117771
dataset_size: 32670635
---
# Dataset Card for "oasst1_prompt_dataset_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/godeater | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of God Eater
This is the image base of bangumi GOD EATER. We detected 23 characters and 1,589 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 31 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 6 | [Download](1/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 2 | 176 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 22 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 49 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 527 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 32 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 28 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 15 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 17 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 50 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 124 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 26 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 24 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 10 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 9 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 24 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 61 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 121 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 131 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 16 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 6 | [Download](21/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 84 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
autoevaluate/autoeval-staging-eval-samsum-samsum-41c5cd-15606152 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: SamuelAllen123/t5-efficient-large-nl36_fine_tuned_for_sum
metrics: ['mae']
dataset_name: samsum
dataset_config: samsum
dataset_split: validation
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: SamuelAllen123/t5-efficient-large-nl36_fine_tuned_for_sum
* Dataset: samsum
* Config: samsum
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@SamuelAllen123](https://huggingface.co/SamuelAllen123) for evaluating this model. |
enrdur/monero_xmr_question_answer | ---
language:
- en
license: wtfpl
pretty_name: XMR questions & answers
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 433789
num_examples: 842
download_size: 227429
dataset_size: 433789
tags:
- finance
---
# Monero (XMR) Q&A Dataset
## Overview
The Monero (XMR) Q&A Dataset is a meticulously curated compilation of questions and answers focused on the Monero cryptocurrency. This dataset is designed to serve as a resource for machine learning practitioners, data scientists, cryptocurrency enthusiasts, and researchers aiming to build models that can understand, interact with, or analyze the Monero ecosystem.
## Features
- **Comprehensive Coverage**: The dataset covers a wide array of topics, ranging from basic concepts like "What is Monero?" to more complex subjects such as ring signatures, stealth addresses, and privacy mechanisms.
- **Quality Assurance**: Each entry has undergone thorough validation to ensure factual accuracy and relevance to the evolving landscape of Monero.
- **Machine Learning Ready**: Formatted to be readily used in a variety of machine learning models, including NLP algorithms for chatbots.
## Applications
- **Chatbots**: Enhance the conversational capabilities of bots focused on cryptocurrency topics.
## Format
The dataset is structured as pairs of questions and answers; you will need to process it further if your model expects a particular format. |
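As one possible approach (a sketch; the "### Question / ### Answer" template below is an assumption, not a format the dataset prescribes), each row can be rendered into an instruction-style training string:

```python
# Sketch: rendering one {question, answer} row as a single training string.
# The template is an assumption — adapt it to whatever prompt format your
# model expects (e.g. a chat template). The sample row is illustrative.

def to_prompt(row: dict) -> str:
    return f"### Question:\n{row['question']}\n\n### Answer:\n{row['answer']}"

sample = {
    "question": "What is Monero?",
    "answer": "Monero (XMR) is a privacy-focused cryptocurrency.",
}
print(to_prompt(sample))
```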
DominusTea/GreekLegalSum | ---
license: cc-by-nc-4.0
task_categories:
- summarization
language:
- el
pretty_name: Greek Court Summarization Dataset
size_categories:
- 100M<n<1B
--- |
hamel/sample-OpenOrca | ---
license: mit
---
|
distilled-from-one-sec-cv12/chunk_63 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1119612516
num_examples: 218163
download_size: 1135359882
dataset_size: 1119612516
---
# Dataset Card for "chunk_63"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/augmentatio-standardized_cluster_4_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 13964071
num_examples: 6698
download_size: 6329527
dataset_size: 13964071
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "augmentatio-standardized_cluster_4_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alikanakar/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 13484975.0
num_examples: 20
download_size: 0
dataset_size: 13484975.0
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ademax/extract_metadata_ademax_contract | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 5893620
num_examples: 2574
download_size: 282155
dataset_size: 5893620
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "extract_metadata_ademax_contract"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
centroIA/MistralInstruct | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2682613
num_examples: 967
download_size: 694943
dataset_size: 2682613
---
# Dataset Card for "MistralInstruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangjinlong/dzqge | ---
license: mit
---
|
Treza12/BreastCancer | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-chat | ---
pretty_name: Evaluation run of h2oai/h2o-danube2-1.8b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2o-danube2-1.8b-chat](https://huggingface.co/h2oai/h2o-danube2-1.8b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T14:38:42.310290](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-chat/blob/main/results_2024-04-05T14-38-42.310290.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3868399519571543,\n\
\ \"acc_stderr\": 0.03387509761212604,\n \"acc_norm\": 0.3867414836209043,\n\
\ \"acc_norm_stderr\": 0.03457957557740848,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476189,\n \"mc2\": 0.4053932180068248,\n\
\ \"mc2_stderr\": 0.014576274541453002\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41467576791808874,\n \"acc_stderr\": 0.01439707056440917,\n\
\ \"acc_norm\": 0.43686006825938567,\n \"acc_norm_stderr\": 0.01449442158425652\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5461063533160725,\n\
\ \"acc_stderr\": 0.004968521608065464,\n \"acc_norm\": 0.7389962158932484,\n\
\ \"acc_norm_stderr\": 0.004382844128643413\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3886792452830189,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.3886792452830189,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.03962135573486219,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.03962135573486219\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.35260115606936415,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.03115852213135778,\n\
\ \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.03115852213135778\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484875,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484875\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3903225806451613,\n\
\ \"acc_stderr\": 0.027751256636969576,\n \"acc_norm\": 0.3903225806451613,\n\
\ \"acc_norm_stderr\": 0.027751256636969576\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358611,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358611\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.0389853160557942,\n\
\ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.0389853160557942\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.5025906735751295,\n \"acc_stderr\": 0.03608390745384488,\n\
\ \"acc_norm\": 0.5025906735751295,\n \"acc_norm_stderr\": 0.03608390745384488\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43486238532110094,\n \"acc_stderr\": 0.021254631465609273,\n \"\
acc_norm\": 0.43486238532110094,\n \"acc_norm_stderr\": 0.021254631465609273\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690225,\n \"\
acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690225\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.46568627450980393,\n \"acc_stderr\": 0.035010383276358976,\n \"\
acc_norm\": 0.46568627450980393,\n \"acc_norm_stderr\": 0.035010383276358976\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5316455696202531,\n \"acc_stderr\": 0.032481974005110756,\n \
\ \"acc_norm\": 0.5316455696202531,\n \"acc_norm_stderr\": 0.032481974005110756\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.42152466367713004,\n\
\ \"acc_stderr\": 0.033141902221106585,\n \"acc_norm\": 0.42152466367713004,\n\
\ \"acc_norm_stderr\": 0.033141902221106585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5041322314049587,\n \"acc_stderr\": 0.045641987674327526,\n \"\
acc_norm\": 0.5041322314049587,\n \"acc_norm_stderr\": 0.045641987674327526\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.03881891213334383,\n\
\ \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.03881891213334383\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\
\ \"acc_stderr\": 0.03217180182641087,\n \"acc_norm\": 0.594017094017094,\n\
\ \"acc_norm_stderr\": 0.03217180182641087\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.48020434227330777,\n\
\ \"acc_stderr\": 0.017865944827291626,\n \"acc_norm\": 0.48020434227330777,\n\
\ \"acc_norm_stderr\": 0.017865944827291626\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.026720034380514998,\n\
\ \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.026720034380514998\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26927374301675977,\n\
\ \"acc_stderr\": 0.014835616582882625,\n \"acc_norm\": 0.26927374301675977,\n\
\ \"acc_norm_stderr\": 0.014835616582882625\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.02799672318063145,\n\
\ \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.02799672318063145\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.47266881028938906,\n\
\ \"acc_stderr\": 0.028355633568328188,\n \"acc_norm\": 0.47266881028938906,\n\
\ \"acc_norm_stderr\": 0.028355633568328188\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4506172839506173,\n \"acc_stderr\": 0.027684721415656192,\n\
\ \"acc_norm\": 0.4506172839506173,\n \"acc_norm_stderr\": 0.027684721415656192\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.333116036505867,\n\
\ \"acc_stderr\": 0.012037930451512052,\n \"acc_norm\": 0.333116036505867,\n\
\ \"acc_norm_stderr\": 0.012037930451512052\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.028661996202335307,\n\
\ \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.028661996202335307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3758169934640523,\n \"acc_stderr\": 0.019594021136577454,\n \
\ \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.019594021136577454\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.42727272727272725,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.42727272727272725,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330432,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330432\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.44776119402985076,\n\
\ \"acc_stderr\": 0.03516184772952167,\n \"acc_norm\": 0.44776119402985076,\n\
\ \"acc_norm_stderr\": 0.03516184772952167\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.036965843170106004,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.036965843170106004\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5321637426900585,\n \"acc_stderr\": 0.038268824176603704,\n\
\ \"acc_norm\": 0.5321637426900585,\n \"acc_norm_stderr\": 0.038268824176603704\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476189,\n \"mc2\": 0.4053932180068248,\n\
\ \"mc2_stderr\": 0.014576274541453002\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6890292028413575,\n \"acc_stderr\": 0.01300953473628607\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3055344958301744,\n \
\ \"acc_stderr\": 0.012688134076726875\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2o-danube2-1.8b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|arc:challenge|25_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|gsm8k|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hellaswag|10_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-38-42.310290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T14-38-42.310290.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- '**/details_harness|winogrande|5_2024-04-05T14-38-42.310290.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T14-38-42.310290.parquet'
- config_name: results
data_files:
- split: 2024_04_05T14_38_42.310290
path:
- results_2024-04-05T14-38-42.310290.parquet
- split: latest
path:
- results_2024-04-05T14-38-42.310290.parquet
---
# Dataset Card for Evaluation run of h2oai/h2o-danube2-1.8b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [h2oai/h2o-danube2-1.8b-chat](https://huggingface.co/h2oai/h2o-danube2-1.8b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-chat",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-05T14:38:42.310290](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-chat/blob/main/results_2024-04-05T14-38-42.310290.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3868399519571543,
"acc_stderr": 0.03387509761212604,
"acc_norm": 0.3867414836209043,
"acc_norm_stderr": 0.03457957557740848,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476189,
"mc2": 0.4053932180068248,
"mc2_stderr": 0.014576274541453002
},
"harness|arc:challenge|25": {
"acc": 0.41467576791808874,
"acc_stderr": 0.01439707056440917,
"acc_norm": 0.43686006825938567,
"acc_norm_stderr": 0.01449442158425652
},
"harness|hellaswag|10": {
"acc": 0.5461063533160725,
"acc_stderr": 0.004968521608065464,
"acc_norm": 0.7389962158932484,
"acc_norm_stderr": 0.004382844128643413
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3886792452830189,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.3886792452830189,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.03962135573486219,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.03962135573486219
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.35260115606936415,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.35260115606936415,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.03115852213135778,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.03115852213135778
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3724137931034483,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.3724137931034483,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3903225806451613,
"acc_stderr": 0.027751256636969576,
"acc_norm": 0.3903225806451613,
"acc_norm_stderr": 0.027751256636969576
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358611,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358611
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.0389853160557942,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.0389853160557942
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5025906735751295,
"acc_stderr": 0.03608390745384488,
"acc_norm": 0.5025906735751295,
"acc_norm_stderr": 0.03608390745384488
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43486238532110094,
"acc_stderr": 0.021254631465609273,
"acc_norm": 0.43486238532110094,
"acc_norm_stderr": 0.021254631465609273
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690225,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690225
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.46568627450980393,
"acc_stderr": 0.035010383276358976,
"acc_norm": 0.46568627450980393,
"acc_norm_stderr": 0.035010383276358976
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5316455696202531,
"acc_stderr": 0.032481974005110756,
"acc_norm": 0.5316455696202531,
"acc_norm_stderr": 0.032481974005110756
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.42152466367713004,
"acc_stderr": 0.033141902221106585,
"acc_norm": 0.42152466367713004,
"acc_norm_stderr": 0.033141902221106585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5041322314049587,
"acc_stderr": 0.045641987674327526,
"acc_norm": 0.5041322314049587,
"acc_norm_stderr": 0.045641987674327526
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.03881891213334383,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.03881891213334383
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641087,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641087
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.48020434227330777,
"acc_stderr": 0.017865944827291626,
"acc_norm": 0.48020434227330777,
"acc_norm_stderr": 0.017865944827291626
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.026720034380514998,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.026720034380514998
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26927374301675977,
"acc_stderr": 0.014835616582882625,
"acc_norm": 0.26927374301675977,
"acc_norm_stderr": 0.014835616582882625
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.47266881028938906,
"acc_stderr": 0.028355633568328188,
"acc_norm": 0.47266881028938906,
"acc_norm_stderr": 0.028355633568328188
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4506172839506173,
"acc_stderr": 0.027684721415656192,
"acc_norm": 0.4506172839506173,
"acc_norm_stderr": 0.027684721415656192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.333116036505867,
"acc_stderr": 0.012037930451512052,
"acc_norm": 0.333116036505867,
"acc_norm_stderr": 0.012037930451512052
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.028661996202335307,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.028661996202335307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3758169934640523,
"acc_stderr": 0.019594021136577454,
"acc_norm": 0.3758169934640523,
"acc_norm_stderr": 0.019594021136577454
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.42727272727272725,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.42727272727272725,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330432,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330432
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.44776119402985076,
"acc_stderr": 0.03516184772952167,
"acc_norm": 0.44776119402985076,
"acc_norm_stderr": 0.03516184772952167
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.036965843170106004,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.036965843170106004
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5321637426900585,
"acc_stderr": 0.038268824176603704,
"acc_norm": 0.5321637426900585,
"acc_norm_stderr": 0.038268824176603704
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476189,
"mc2": 0.4053932180068248,
"mc2_stderr": 0.014576274541453002
},
"harness|winogrande|5": {
"acc": 0.6890292028413575,
"acc_stderr": 0.01300953473628607
},
"harness|gsm8k|5": {
"acc": 0.3055344958301744,
"acc_stderr": 0.012688134076726875
}
}
```
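As a quick sanity check, per-task scores like those above can be aggregated in plain Python. A minimal sketch over a small excerpt of the results dict (the values are copied verbatim from the JSON above; the unweighted mean shown here is only an illustration, not necessarily how the leaderboard itself aggregates):

```python
# Excerpt of the per-task results shown above (values copied verbatim).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45185185185185184},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.375},
}

# Unweighted mean accuracy over the selected MMLU subtasks.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} subtasks: {mean_acc:.4f}")
```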
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Corotinho/aaaa | ---
license: openrail
---
|
nbroad/fewnerd-organizations | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-ORG
'2': I-ORG
splits:
- name: train
num_bytes: 50533000
num_examples: 122459
- name: test
num_bytes: 15189310
num_examples: 36738
- name: validation
num_bytes: 6485434
num_examples: 15745
download_size: 18458832
dataset_size: 72207744
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
Ketan8010/AI | ---
dataset_info:
features:
- name: Template
dtype: string
splits:
- name: train
num_bytes: 72864
num_examples: 503
download_size: 33361
dataset_size: 72864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_sst2_more_much | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 7849
num_examples: 51
- name: test
num_bytes: 16995
num_examples: 110
- name: train
num_bytes: 238053
num_examples: 2043
download_size: 133672
dataset_size: 262897
---
# Dataset Card for "MULTI_VALUE_sst2_more_much"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/airo-bad | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1897056
num_examples: 1000
download_size: 1088702
dataset_size: 1897056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
skrishna/coin_flip_transformed | ---
dataset_info:
features:
- name: targets
dtype: string
- name: targets_vec
sequence: int64
- name: inputs
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 1396936
num_examples: 2000
download_size: 334544
dataset_size: 1396936
---
# Dataset Card for "coin_flip_transformed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
renumics/emodb-enriched | ---
size_categories:
- n<1K
dataset_info:
features:
- name: age
dtype: float32
- name: gender
dtype:
class_label:
names:
'0': female
'1': male
- name: emotion
dtype:
class_label:
names:
'0': anger
'1': boredom
'2': disgust
'3': fear
'4': happiness
'5': neutral
'6': sadness
- name: audio
dtype: audio
- name: m1_gender_prediction
dtype:
class_label:
names:
'0': female
'1': male
- name: m2_gender_prediction
dtype:
class_label:
names:
'0': female
'1': male
- name: m1_embedding
sequence: float32
length: 1028
- name: m2_embedding
sequence: float32
length: 1028
- name: emotion_embedding
sequence: float32
length: 1024
- name: m1_correct
dtype:
class_label:
names:
'0': wrong
'1': correct
- name: m2_correct
dtype:
class_label:
names:
'0': wrong
'1': correct
splits:
- name: train
num_bytes: 54231717.0
num_examples: 535
download_size: 56965550
dataset_size: 54231717.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Emo-DB
## Dataset Description
### Emo-DB Database
The Emo-DB database is a freely available German emotional speech database, created by the Institute of Communication Science, Technical University of Berlin, Germany. Ten professional speakers (five male and five female) participated in the recordings. The database contains a total of 535 utterances and comprises seven emotions: 1) anger; 2) boredom; 3) anxiety; 4) happiness; 5) sadness; 6) disgust; and 7) neutral. The data was recorded at a 48-kHz sampling rate and then down-sampled to 16 kHz.
### Additional Information
Original URL: https://www.tu.berlin/en/kw/research/projects/emotional-speech

Every utterance is named according to the same scheme:
- Positions 1-2: number of the speaker
- Positions 3-5: code for the text
- Position 6: emotion (the letter stands for the German emotion word)
- Position 7: if there are more than two versions, these are numbered a, b, c, ...

Example: `03a01Fa.wav` is the audio file from speaker 03 speaking text a01 with the emotion "Freude" (happiness).
Information about the speakers:
- 03 - male, 31 years old
- 08 - female, 34 years old
- 09 - female, 21 years old
- 10 - male, 32 years old
- 11 - male, 26 years old
- 12 - male, 30 years old
- 13 - female, 32 years old
- 14 - female, 35 years old
- 15 - male, 25 years old
- 16 - female, 31 years old
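The filename scheme above is easy to parse programmatically. A minimal sketch (the German emotion-letter mapping below follows the commonly used Emo-DB convention and is an assumption, not taken from this card):

```python
# Map Emo-DB emotion letters (German words) to English labels.
# Assumption: standard Emo-DB letter codes (W=Wut, L=Langeweile,
# E=Ekel, A=Angst, F=Freude, T=Trauer, N=neutral).
EMOTION_CODES = {
    "W": "anger",
    "L": "boredom",
    "E": "disgust",
    "A": "fear",
    "F": "happiness",
    "T": "sadness",
    "N": "neutral",
}

def parse_emodb_name(filename: str) -> dict:
    """Split an Emo-DB filename into speaker, text code, emotion, version."""
    stem = filename.rsplit(".", 1)[0]  # drop the ".wav" extension
    return {
        "speaker": stem[0:2],                            # positions 1-2
        "text": stem[2:5],                               # positions 3-5
        "emotion": EMOTION_CODES.get(stem[5], stem[5]),  # position 6
        "version": stem[6:] or None,                     # position 7, if any
    }

print(parse_emodb_name("03a01Fa.wav"))
# → {'speaker': '03', 'text': 'a01', 'emotion': 'happiness', 'version': 'a'}
```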
|
PNLPhub/P3-XL-WiC | ---
license: apache-2.0
---
|
allevelly/dataset | ---
license: creativeml-openrail-m
---
The data comes from tweets collected and classified through Crowdbreaks.org [Müller, Martin M., and Marcel Salathé. "Crowdbreaks: Tracking Health Trends Using Public Social Media Data and Crowdsourcing." Frontiers in Public Health 7 (2019).]. Tweets have been classified as pro-vaccine (1), neutral (0), or anti-vaccine (-1). |
vIVANsy/Buisness-Ideas | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-WillHeld__stereoset_zero-WillHeld__stereoset_zero-7a6673-2074067133 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- WillHeld/stereoset_zero
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: []
dataset_name: WillHeld/stereoset_zero
dataset_config: WillHeld--stereoset_zero
dataset_split: train
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: WillHeld/stereoset_zero
* Config: WillHeld--stereoset_zero
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@WillHeld](https://huggingface.co/WillHeld) for evaluating this model. |
jbrinkma/pile-300k | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: pile_set_name
dtype: string
splits:
- name: train
num_bytes: 1675060733
num_examples: 300000
download_size: 873058629
dataset_size: 1675060733
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nev/nto2023-hack-dataset | ---
license: isc
---
|
bunny219/mae | ---
license: openrail
---
|
yzhuang/autotree_automl_eye_movements_gosdt_l512_d3_sd2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2696882416
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_eye_movements_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
metredo085/taniacerto | ---
license: apache-2.0
---
|
ThWu/safety_benchmark | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: is_response_0_safe
dtype: bool
- name: is_response_1_safe
dtype: bool
- name: better_response_id
dtype: int64
- name: safer_response_id
dtype: int64
- name: better_response
dtype: string
- name: safer_response
dtype: string
- name: ranked_responses
sequence: string
splits:
- name: train
num_bytes: 1418163
num_examples: 1000
download_size: 736108
dataset_size: 1418163
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yashtiwari/PaulMooney-Medical-ASR-Data | ---
dataset_info:
features:
- name: id
dtype: string
- name: sentence
dtype: string
- name: prompt
dtype: string
- name: speaker_id
dtype: int64
- name: path
dtype: audio
splits:
- name: train
num_bytes: 334850167.0
num_examples: 381
- name: validation
num_bytes: 295413087.0
num_examples: 385
- name: test
num_bytes: 5300069897.705
num_examples: 5895
download_size: 9155379864
dataset_size: 5930333151.705
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|