| datasetId | card |
|---|---|
Intuit-GenSRF/jigsaw-unintended-bias-train-fr | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
splits:
- name: train
num_bytes: 688756878
num_examples: 1900136
download_size: 439186843
dataset_size: 688756878
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "jigsaw-unintended-bias-train-fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
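The `dataset_info` block above declares each row as a `text` string plus a `labels` sequence of strings. A minimal stdlib sketch of that schema as a row validator — the example row is invented for illustration, not taken from the dataset:

```python
# Check that a row matches the declared schema:
# text: string, labels: sequence of string.

def conforms(row: dict) -> bool:
    """Return True if `row` has a string `text` and a list of string `labels`."""
    return (
        set(row) == {"text", "labels"}
        and isinstance(row["text"], str)
        and isinstance(row["labels"], list)
        and all(isinstance(label, str) for label in row["labels"])
    )

example = {"text": "Un commentaire hypothetique.", "labels": ["toxicity"]}
print(conforms(example))  # True
```

The same shape applies to every row of the single `train` split (1,900,136 examples).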
HydraLM/partitioned_v3_standardized_028 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 31271468.45509263
num_examples: 58156
download_size: 5794647
dataset_size: 31271468.45509263
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_028"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
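The field names above (`conversation_id`, `message_id`, `message_type`) suggest a flat, one-message-per-row encoding of conversations. A sketch of one plausible way to reassemble them, assuming `conversation_id` groups rows and `message_id` orders them — an inference from the field names, not something the card documents; the rows below are synthetic:

```python
from collections import defaultdict

def group_conversations(rows):
    """Group rows by conversation_id and sort each group by message_id."""
    convos = defaultdict(list)
    for row in rows:
        convos[row["conversation_id"]].append(row)
    for messages in convos.values():
        messages.sort(key=lambda r: r["message_id"])
    return dict(convos)

rows = [
    {"conversation_id": 1, "message_id": 2, "message": "Hi!", "message_type": "response"},
    {"conversation_id": 1, "message_id": 1, "message": "Hello?", "message_type": "prompt"},
]
convos = group_conversations(rows)
print([m["message"] for m in convos[1]])  # ['Hello?', 'Hi!']
```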
lang-uk/dragoman | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_rte_you_ye | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 22134
num_examples: 42
- name: train
num_bytes: 17532
num_examples: 34
download_size: 36416
dataset_size: 39666
---
# Dataset Card for "MULTI_VALUE_rte_you_ye"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
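The split metadata above is internally consistent: `dataset_size` (39666) is the sum of the per-split `num_bytes`. A quick check of that invariant using the numbers from the card:

```python
# Per-split figures copied from the dataset_info block above.
splits = {
    "test": {"num_bytes": 22134, "num_examples": 42},
    "train": {"num_bytes": 17532, "num_examples": 34},
}

dataset_size = sum(s["num_bytes"] for s in splits.values())
num_examples = sum(s["num_examples"] for s in splits.values())
print(dataset_size, num_examples)  # 39666 76
```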
KolaGang/processed_privacysumshort | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 755612066
num_examples: 261194
download_size: 214233590
dataset_size: 755612066
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Adminhuggingface/LORA_ONE_DATA | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2493084.0
num_examples: 6
download_size: 2495157
dataset_size: 2493084.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LORA_ONE_DATA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO | ---
pretty_name: Evaluation run of yanolja/Bookworm-10.7B-v0.4-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yanolja/Bookworm-10.7B-v0.4-DPO](https://huggingface.co/yanolja/Bookworm-10.7B-v0.4-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T18:19:15.058025](https://huggingface.co/datasets/open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO/blob/main/results_2024-02-01T18-19-15.058025.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6512039470522575,\n\
\ \"acc_stderr\": 0.032016258824533204,\n \"acc_norm\": 0.6543530523533914,\n\
\ \"acc_norm_stderr\": 0.03265904724752235,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720116,\n \"mc2\": 0.5238117102691138,\n\
\ \"mc2_stderr\": 0.01570708203583901\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979282,\n\
\ \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840056\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.656144194383589,\n\
\ \"acc_stderr\": 0.0047402292124734575,\n \"acc_norm\": 0.8442541326428998,\n\
\ \"acc_norm_stderr\": 0.0036187316588377092\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4523809523809524,\n \"acc_stderr\": 0.02563425811555495,\n \"\
acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.02563425811555495\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8282828282828283,\n \"acc_stderr\": 0.026869716187429903,\n \"\
acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.026869716187429903\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644234,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097413,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097413\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3925925925925926,\n \"acc_stderr\": 0.029773847012532967,\n \
\ \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.029773847012532967\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978082,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978082\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824846,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824846\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464078,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464078\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.485006518904824,\n\
\ \"acc_stderr\": 0.01276449320219326,\n \"acc_norm\": 0.485006518904824,\n\
\ \"acc_norm_stderr\": 0.01276449320219326\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066375,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066375\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368053,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368053\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720116,\n \"mc2\": 0.5238117102691138,\n\
\ \"mc2_stderr\": 0.01570708203583901\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.0109951723180198\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5223654283548143,\n \
\ \"acc_stderr\": 0.013758699485911838\n }\n}\n```"
repo_url: https://huggingface.co/yanolja/Bookworm-10.7B-v0.4-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|arc:challenge|25_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|arc:challenge|25_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|gsm8k|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|gsm8k|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hellaswag|10_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hellaswag|10_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-16-15.402421.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-19-15.058025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T18-19-15.058025.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- '**/details_harness|winogrande|5_2024-02-01T18-16-15.402421.parquet'
- split: 2024_02_01T18_19_15.058025
path:
- '**/details_harness|winogrande|5_2024-02-01T18-19-15.058025.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T18-19-15.058025.parquet'
- config_name: results
data_files:
- split: 2024_02_01T18_16_15.402421
path:
- results_2024-02-01T18-16-15.402421.parquet
- split: 2024_02_01T18_19_15.058025
path:
- results_2024-02-01T18-19-15.058025.parquet
- split: latest
path:
- results_2024-02-01T18-19-15.058025.parquet
---
# Dataset Card for Evaluation run of yanolja/Bookworm-10.7B-v0.4-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yanolja/Bookworm-10.7B-v0.4-DPO](https://huggingface.co/yanolja/Bookworm-10.7B-v0.4-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO",
"harness_winogrande_5",
	split="latest")
```
## Latest results

These are the [latest results from run 2024-02-01T18:19:15.058025](https://huggingface.co/datasets/open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO/blob/main/results_2024-02-01T18-19-15.058025.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6512039470522575,
"acc_stderr": 0.032016258824533204,
"acc_norm": 0.6543530523533914,
"acc_norm_stderr": 0.03265904724752235,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720116,
"mc2": 0.5238117102691138,
"mc2_stderr": 0.01570708203583901
},
"harness|arc:challenge|25": {
"acc": 0.6177474402730375,
"acc_stderr": 0.014200454049979282,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840056
},
"harness|hellaswag|10": {
"acc": 0.656144194383589,
"acc_stderr": 0.0047402292124734575,
"acc_norm": 0.8442541326428998,
"acc_norm_stderr": 0.0036187316588377092
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.02563425811555495,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.02563425811555495
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.026869716187429903,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.026869716187429903
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644234,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097413,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097413
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.029773847012532967,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.029773847012532967
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978082,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978082
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824846,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824846
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464078,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464078
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.016384638410380823,
"acc_norm": 0.4,
"acc_norm_stderr": 0.016384638410380823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.485006518904824,
"acc_stderr": 0.01276449320219326,
"acc_norm": 0.485006518904824,
"acc_norm_stderr": 0.01276449320219326
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066375,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368053,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368053
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720116,
"mc2": 0.5238117102691138,
"mc2_stderr": 0.01570708203583901
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.0109951723180198
},
"harness|gsm8k|5": {
"acc": 0.5223654283548143,
"acc_stderr": 0.013758699485911838
}
}
```
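As a local sketch (using a hand-written subset of the task keys above, not the full file), the per-task accuracies in a results dictionary of this shape can be filtered and averaged without any leaderboard tooling:

```python
# Average the 5-shot MMLU ("hendrycksTest") accuracies from a results dict
# shaped like the JSON above. Only a few illustrative entries are included;
# the key prefix distinguishes MMLU subtasks from other benchmarks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5555555555555556},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7236842105263158},
    "harness|winogrande|5": {"acc": 0.8113654301499605},  # excluded by the filter
}

mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} tasks: {mmlu_avg:.4f}")
```

The same pattern applies to `acc_norm` or any other metric key present in the per-task dictionaries.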
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NobodyExistsOnTheInternet/SystemMessageContradictionsSharegptv2 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: system message
dtype: string
- name: reversed sysmsg
dtype: string
- name: reversed response
dtype: string
splits:
- name: train
num_bytes: 1285032417
num_examples: 90258
download_size: 413478568
dataset_size: 1285032417
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mnoukhov/summarize_from_feedback_tldr3_generated_20k_vllm_pythia1b_dpo_temp0.7_length128 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 37628107
num_examples: 19999
download_size: 23026580
dataset_size: 37628107
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zeio/auto-pale | ---
language:
- en
license: apache-2.0
tags:
- gaming
annotation_creators:
- crowdsourced
language_creators:
- crowdsourced
pretty_name: pale
size_categories:
- 10K<n<100K
task_categories:
- text-generation
- text-classification
- automatic-speech-recognition
configs:
- config_name: quotes
data_files:
- split: train
path: quotes/*.parquet
default: true
- config_name: vanilla
data_files:
- split: train
path: vanilla/*.parquet
default: false
- config_name: annotated
data_files:
- split: train
path: annotated/*.parquet
default: false
- config_name: pulled
data_files:
- split: train
path: pulled/*.parquet
default: false
dataset_info:
- config_name: pulled
features:
- name: header
dtype: string
- name: subheader
dtype: string
- name: text
dtype: string
- name: sound
dtype:
audio:
sampling_rate: 44100
- name: champion
dtype: string
splits:
- name: train
num_bytes: 4621864509.2
num_examples: 67575
download_size: 2557617774
dataset_size: 4621864509.2
- config_name: quotes
features:
- name: header
dtype: string
- name: subheader
dtype: string
- name: text
dtype: string
- name: champion
dtype: string
splits:
- name: train
num_bytes: 2499768
num_examples: 31001
download_size: 947409
dataset_size: 2499768
- config_name: vanilla
features:
- name: header
dtype: string
- name: subheader
dtype: string
- name: text
dtype: string
- name: source
dtype: string
- name: champion
dtype: string
splits:
- name: train
num_bytes: 14430202
num_examples: 67575
download_size: 2675223
dataset_size: 14430202
- config_name: annotated
features:
- name: header
dtype: string
- name: subheader
dtype: string
- name: text
dtype: string
- name: source
dtype: string
- name: champion
dtype: string
- name: quote
dtype: bool
splits:
- name: train
num_bytes: 14339149
num_examples: 67575
download_size: 2681173
dataset_size: 14339149
---
# Dataset card for pale
## Table of contents
- [Dataset description](#dataset-description)
- [Dataset summary](#dataset-summary)
- [Dataset structure](#dataset-structure)
- [Dataset instance](#dataset-instance)
- [Dataset fields](#dataset-fields)
## Dataset description
- **Homepage:** [pale homepage](https://huggingface.co/datasets/zeio/pale)
- **Repository:** [pale repository](https://huggingface.co/datasets/zeio/pale)
- **Point of contact:** [Zeio Nara](mailto:zeionara@gmail.com)
- **Dataset version:** `30.10.2023`
### Dataset summary
This dataset contains league of legends champions' quotes parsed from [fandom](https://leagueoflegends.fandom.com).
See dataset usage example [at google colab](https://cutt.ly/3wEKDUI9).
The dataset is available in the following configurations:
1. `vanilla` - all data pulled from the website without significant modifications apart from the web page structure parsing;
1. `quotes` - a truncated version of the corpus, which doesn't contain sound effects;
1. `annotated` - an extended version of the `vanilla` configuration with a couple of additional label columns;
1. `pulled` - same as `vanilla`, but sound files have been pulled from the website, and the `source` column is replaced with `sound`.
## Dataset structure
### Data instance
An example of an entry from the dataset is given below:
```json
{
"header": "Attack",
"subheader": "Attacking",
"text": "Kindred: \"The masks of the Kindred seek you!\"",
"source": "https://static.wikia.nocookie.net/leagueoflegends/images/1/12/Kindred_Original_Passive_Mark_Enemy_6.ogg/revision/latest?cb=20221204121356",
"champion": "kindred"
}
```
### Data fields
Each dataset entry therefore consists of the following fields:
- `header` - main category of the text;
- `subheader` - secondary category of the text (none in some cases);
- `text` - text said by the champion or description of sound made by the champion;
- `source` - link to the audio file (only `vanilla` configuration);
- `champion` - name of the champion in lowercase;
- `quote` - binary field displaying whether corresponding text contains quote or not (only `annotated` configuration);
- `sound` - audio data for the entry (only `pulled` configuration).
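As a small illustration (over hand-written records shaped like the instance above, not the real corpus), the flat field layout makes per-champion grouping straightforward:

```python
# Group entry texts by champion; the records below are stand-ins
# shaped like the dataset instance shown above.
from collections import defaultdict

entries = [
    {"header": "Attack", "subheader": "Attacking",
     "text": 'Kindred: "The masks of the Kindred seek you!"', "champion": "kindred"},
    {"header": "Move", "subheader": None,
     "text": 'Kindred: "We are swift."', "champion": "kindred"},
    {"header": "Attack", "subheader": None,
     "text": '"Fear the void."', "champion": "malzahar"},
]

by_champion = defaultdict(list)
for entry in entries:
    by_champion[entry["champion"]].append(entry["text"])

print({champ: len(texts) for champ, texts in by_champion.items()})
```

The same grouping works on the loaded `quotes` configuration, since every row carries the `champion` field.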
|
DataHammer/scimrc | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
# Scientific Emotional Dialogue
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is a dataset for question answering on scientific research papers. It consists of 21,297 question-answer-evidence pairs.
### Supported Tasks and Leaderboards
- question-answering: The dataset can be used to train a model for Scientific Question Answering. Success on this task is typically measured by achieving a high F1 score.
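The F1 score mentioned above is typically computed as token-level overlap between a predicted answer and the gold answer. A minimal sketch (whitespace tokenization only; the official SQuAD-style scripts additionally lowercase and strip punctuation and articles):

```python
from collections import Counter

def token_f1(prediction: str, gold: str) -> float:
    """Token-overlap F1, the usual extractive-QA metric (simplified)."""
    pred_tokens = prediction.split()
    gold_tokens = gold.split()
    # Multiset intersection counts each shared token at most min(count) times.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

print(token_f1("a framework for any domain", "a framework for any domain hierarchy"))
```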
### Languages
English
## Dataset Structure
### Data Instances
A typical instance in the dataset:
```
{
"question": "What aim do the authors have by improving Wiki(GOLD) results?",
"answer": "The aim is not to tune their model specifically on this class hierarchy. They instead aim to present a framework which can be modified easily to any domain hierarchy and has acceptable out-of-the-box performances to any fine-grained dataset.",
"evidence": "The results for each class type are shown in Table TABREF19 , with some specific examples shown in Figure FIGREF18 . For the Wiki(gold) we quote the micro-averaged F-1 scores for the entire top level entity category. The total F-1 score on the OntoNotes dataset is 88%, and the total F-1 cross-validation score on the 112 class Wiki(gold) dataset is 53%. It is worth noting that one could improve Wiki(gold) results by training directly using this dataset. However, the aim is not to tune our model specifically on this class hierarchy. We instead aim to present a framework which can be modified easily to any domain hierarchy and has acceptable out-of-the-box performances to any fine-grained dataset. The results in Table TABREF19 (OntoNotes) only show the main 7 categories in OntoNotes which map to Wiki(gold) for clarity. The other categories (date, time, norp, language, ordinal, cardinal, quantity, percent, money, law) have F-1 scores between 80-90%, with the exception of time (65%)\nIt is worth noting that one could improve Wiki(GOLD) results by training directly using this dataset. However, the aim is not to tune our model specifically on this class hierarchy. We instead aim to present a framework which can be modified easily to any domain hierarchy and has acceptable out-of-the-box performances to any fine-grained dataset.",
"yes_no": false
}
```
|
heliosprime/twitter_dataset_1713224996 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19401
num_examples: 54
download_size: 17657
dataset_size: 19401
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713224996"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gaodrew/sassy-aztec-qa-13k | ---
license: mit
---
|
EgilKarlsen/BGL_BERT_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211882766
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_BERT_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-2e778dac-2622-46c9-930e-6f9e705a27bf-2018 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
ateneoscsl/BUOD_articlescraper | ---
task_categories:
- summarization
language:
- tl
- en
---
# BUOD Article Scraper
Authors: [James Esguerra](https://huggingface.co/jamesesguerra), Julia Avila, [Hazielle Bugayong](https://huggingface.co/0xhaz)
- Article scraper for the KAMI-3000 dataset used to train the BUOD [distilBART](https://huggingface.co/ateneoscsl/BUOD_distilBART_TM) and [bert2bert](https://huggingface.co/ateneoscsl/BUOD_bert2bert_TM) transformer models, and for text summarization tasks in Filipino.
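As a rough sketch of how scraped article/summary pairs might be packaged for summarization fine-tuning (the field names, file layout, and sample text below are illustrative assumptions, not the scraper's actual output schema):

```python
import json

# Hypothetical scraped records; the real scraper's output fields may differ.
records = [
    {
        "title": "Halimbawang balita",
        "body": "Buong teksto ng artikulo.",
        "summary": "Maikling buod.",
    },
]

def to_jsonl(recs, path):
    """Write article/summary pairs as JSON Lines, one training example per line."""
    with open(path, "w", encoding="utf-8") as f:
        for r in recs:
            row = {"text": r["body"], "summary": r["summary"]}
            f.write(json.dumps(row, ensure_ascii=False) + "\n")

to_jsonl(records, "kami_sample.jsonl")
```

A JSONL file in this shape can then be loaded for fine-tuning, for example with `datasets.load_dataset("json", data_files="kami_sample.jsonl")`.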
### Setup
1. Clone the repository.
```sh
# https
git clone https://github.com/avila-bugayong-esguerra/article-scraper.git
# or
# ssh
git clone git@github.com:avila-bugayong-esguerra/article-scraper.git
```
2. Change directory into project folder.
```sh
cd article-scraper
```
3. Create a virtual environment.
```sh
python -m venv venv
```
4. Activate the virtual environment.
```sh
# windows
venv\Scripts\activate
# unix
source venv/bin/activate
```
5. Install the dependencies.
```sh
pip install -r article_scraper/requirements.txt
```
6. Change directory into the Scrapy project.
```sh
cd article_scraper
``` |
abhishekyo/train_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 958652
num_examples: 750
download_size: 172878
dataset_size: 958652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Multimodal-Fatima/OxfordPets_test_facebook_opt_125m_Visclues_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 121460903.375
num_examples: 3669
- name: fewshot_1_bs_16
num_bytes: 122822438.375
num_examples: 3669
- name: fewshot_3_bs_16
num_bytes: 125536937.375
num_examples: 3669
- name: fewshot_5_bs_16
num_bytes: 128243714.375
num_examples: 3669
- name: fewshot_8_bs_16
num_bytes: 132312290.375
num_examples: 3669
download_size: 604694650
dataset_size: 630376283.875
---
# Dataset Card for "OxfordPets_test_facebook_opt_125m_Visclues_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chlee10__T3Q-platypus-SOLAR-10.7B-v1.0 | ---
pretty_name: Evaluation run of chlee10/T3Q-platypus-SOLAR-10.7B-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chlee10/T3Q-platypus-SOLAR-10.7B-v1.0](https://huggingface.co/chlee10/T3Q-platypus-SOLAR-10.7B-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chlee10__T3Q-platypus-SOLAR-10.7B-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-12T05:44:28.893890](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-platypus-SOLAR-10.7B-v1.0/blob/main/results_2024-03-12T05-44-28.893890.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6161023263287835,\n\
\ \"acc_stderr\": 0.03279616909498795,\n \"acc_norm\": 0.6233676222297954,\n\
\ \"acc_norm_stderr\": 0.03351736415113747,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5191281080553253,\n\
\ \"mc2_stderr\": 0.014792664772089011\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522082,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893447\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6443935471021709,\n\
\ \"acc_stderr\": 0.004777183508949811,\n \"acc_norm\": 0.8414658434574785,\n\
\ \"acc_norm_stderr\": 0.003644946730044617\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.0255250343824749,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.0255250343824749\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.025649381063029258,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.025649381063029258\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251742,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251742\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586227,\n \
\ \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586227\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.04142313771996664,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.04142313771996664\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001503,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001503\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.02519018132760842,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.02519018132760842\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.016175692013381957,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.016175692013381957\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493274,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493274\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.029520095697687765,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.029520095697687765\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215937,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215937\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440303,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440303\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5191281080553253,\n\
\ \"mc2_stderr\": 0.014792664772089011\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.01052998141183891\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20621683093252463,\n \
\ \"acc_stderr\": 0.011144364089781441\n }\n}\n```"
repo_url: https://huggingface.co/chlee10/T3Q-platypus-SOLAR-10.7B-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|arc:challenge|25_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|gsm8k|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hellaswag|10_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T05-44-28.893890.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T05-44-28.893890.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- '**/details_harness|winogrande|5_2024-03-12T05-44-28.893890.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-12T05-44-28.893890.parquet'
- config_name: results
data_files:
- split: 2024_03_12T05_44_28.893890
path:
- results_2024-03-12T05-44-28.893890.parquet
- split: latest
path:
- results_2024-03-12T05-44-28.893890.parquet
---
# Dataset Card for Evaluation run of chlee10/T3Q-platypus-SOLAR-10.7B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chlee10/T3Q-platypus-SOLAR-10.7B-v1.0](https://huggingface.co/chlee10/T3Q-platypus-SOLAR-10.7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chlee10__T3Q-platypus-SOLAR-10.7B-v1.0",
"harness_winogrande_5",
split="train")
```
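As a quick illustration of working with the aggregated metrics shown below, here is a minimal, self-contained sketch that finds the best-scoring task among a few scores hand-copied from the "Latest results" section (the task names and numbers come from this card; no download is needed for this example):

```python
# A handful of aggregated scores from the latest run, copied by hand for illustration.
# Keys follow the harness naming convention "harness|<task>|<n_shots>".
scores = {
    "harness|arc:challenge|25": 0.6254266211604096,  # acc_norm
    "harness|hellaswag|10": 0.8414658434574785,      # acc_norm
    "harness|winogrande|5": 0.8310970797158642,      # acc
    "harness|gsm8k|5": 0.20621683093252463,          # acc
}

# Pick the task with the highest reported score.
best_task, best_score = max(scores.items(), key=lambda kv: kv[1])
print(best_task, round(best_score, 4))  # → harness|hellaswag|10 0.8415
```

The same pattern applies to the full per-task dictionary once loaded from the "results" configuration.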
## Latest results
These are the [latest results from run 2024-03-12T05:44:28.893890](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-platypus-SOLAR-10.7B-v1.0/blob/main/results_2024-03-12T05-44-28.893890.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6161023263287835,
"acc_stderr": 0.03279616909498795,
"acc_norm": 0.6233676222297954,
"acc_norm_stderr": 0.03351736415113747,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5191281080553253,
"mc2_stderr": 0.014792664772089011
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522082,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893447
},
"harness|hellaswag|10": {
"acc": 0.6443935471021709,
"acc_stderr": 0.004777183508949811,
"acc_norm": 0.8414658434574785,
"acc_norm_stderr": 0.003644946730044617
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.0255250343824749,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.0255250343824749
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029258,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029258
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251742,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251742
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8312236286919831,
"acc_stderr": 0.024381406832586227,
"acc_norm": 0.8312236286919831,
"acc_norm_stderr": 0.024381406832586227
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.04142313771996664,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.04142313771996664
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001503,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.02519018132760842,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.02519018132760842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381957,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381957
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493274,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493274
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.029520095697687765,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.029520095697687765
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215937,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215937
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440303,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440303
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5191281080553253,
"mc2_stderr": 0.014792664772089011
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.01052998141183891
},
"harness|gsm8k|5": {
"acc": 0.20621683093252463,
"acc_stderr": 0.011144364089781441
}
}
```
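Since the results file is plain JSON, the aggregated metrics can be pulled out with the standard library. A minimal sketch (the snippet hard-codes a fragment of the payload above rather than downloading the file):

```python
import json

# A fragment of the results payload shown above.
payload = """
{
  "all": {
    "acc": 0.6161023263287835,
    "acc_norm": 0.6233676222297954
  },
  "harness|winogrande|5": {
    "acc": 0.8310970797158642
  }
}
"""

results = json.loads(payload)

# The "all" block holds the run-level aggregates; per-task blocks
# are keyed as "harness|<task>|<n_shots>".
print(f"aggregate acc: {results['all']['acc']:.4f}")
for key, metrics in results.items():
    if key.startswith("harness|"):
        _, task, shots = key.split("|")
        print(f"{task} ({shots}-shot): acc={metrics['acc']:.4f}")
```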
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Navarro20/robin | ---
license: openrail
---
|
NeuralNovel/Creative-Logic-v1 | ---
license: apache-2.0
---
|
thenaman/train.json | ---
license: mit
---
|
skvarre/movie_posters-100k-torchvision | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
sequence:
sequence:
sequence: float32
- name: title
dtype: string
- name: genres
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: overview
dtype: string
- name: popularity
dtype: float64
- name: release_date
dtype: string
- name: budget
dtype: int64
- name: revenue
dtype: int64
- name: tagline
dtype: string
- name: original_language
dtype: string
- name: runtime
dtype: int64
splits:
- name: train
num_bytes: 28368086498
num_examples: 95300
download_size: 26503296080
dataset_size: 28368086498
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "movie_posters-100k-torchvision"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arize-ai/movie_reviews_with_context_drift | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
pretty_name: sentiment-classification-reviews-with-drift
size_categories:
- 10K<n<100K
source_datasets:
- extended|imdb
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
# Dataset Card for `reviews_with_drift`
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
This dataset was crafted to be used in our tutorial [Link to the tutorial when ready]. It consists of a large movie review dataset mixed with some reviews from a hotel review dataset. The training and validation sets are drawn purely from the movie review dataset, while the production set is mixed. Additional features have been added (`age`, `gender`, `context`), as well as a made-up timestamp `prediction_ts` indicating when the inference took place.
### Supported Tasks and Leaderboards
`text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the text, predict the sentiment (positive or negative).
### Languages
The text is mainly written in English.
## Dataset Structure
### Data Instances
#### default
An example of `training` looks as follows:
```json
{
'prediction_ts': 1650092416.0,
'age': 44,
'gender': 'female',
'context': 'movies',
'text': "An interesting premise, and Billy Drago is always good as a dangerous nut-bag (side note: I'd love to see Drago, Stephen McHattie and Lance Hendrikson in a flick together; talk about raging cheekbones!). The soundtrack wasn't terrible, either.<br /><br />But the acting--even that of such professionals as Drago and Debbie Rochon--was terrible, the directing worse (perhaps contributory to the former), the dialog chimp-like, and the camera work, barely tolerable. Still, it was the SETS that got a big 10 on my oy-vey scale. I don't know where this was filmed, but were I to hazard a guess, it would be either an open-air museum, or one of those re-enactment villages, where everything is just a bit too well-kept to do more than suggest the real Old West. Okay, so it was shot on a college kid's budget. That said, I could have forgiven one or two of the aforementioned faults. But taken all together, and being generous, I could not see giving it more than three stars.",
'label': 0
}
```
### Data Fields
#### default
The data fields are the same across all splits:
- `prediction_ts`: a `float` feature.
- `age`: an `int` feature.
- `gender`: a `string` feature.
- `context`: a `string` feature.
- `text`: a `string` feature.
- `label`: a `ClassLabel` feature, with possible values including negative(0) and positive(1).
### Data Splits
| name |training|validation|production |
|----------|-------:|---------:|----------:|
| default | 9916 | 2479 | 40079 |
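As a sketch of how a record from these splits might be consumed, the snippet below reuses the `training` example shown earlier (text truncated); the label mapping is taken from the card's `ClassLabel` description:

```python
from datetime import datetime, timezone

# Label names per the card's ClassLabel description.
LABELS = {0: "negative", 1: "positive"}

# The training example shown earlier in this card (text truncated).
record = {
    "prediction_ts": 1650092416.0,
    "age": 44,
    "gender": "female",
    "context": "movies",
    "text": "An interesting premise, and Billy Drago is always good...",
    "label": 0,
}

# prediction_ts is a Unix timestamp; convert it to a datetime so that
# predictions can be bucketed by time when analyzing drift.
ts = datetime.fromtimestamp(record["prediction_ts"], tz=timezone.utc)
print(ts.date(), LABELS[record["label"]], record["context"])
```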
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Contributions
Thanks to [@fjcasti1](https://github.com/fjcasti1) for adding this dataset. |
MUisa/RonuAI | ---
license: openrail
---
|
mHossain/final_train_v2_330000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 9121966.2
num_examples: 27000
- name: test
num_bytes: 1013551.8
num_examples: 3000
download_size: 4447462
dataset_size: 10135518.0
---
# Dataset Card for "final_train_v2_330000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thomasht86/ns3456_3451_clf_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: split
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 103363480
num_examples: 118557
- name: test
num_bytes: 25883559
num_examples: 29700
download_size: 115494808
dataset_size: 129247039
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
nthakur/miracl-raft-instruct-1-pos-4-neg-mistral | ---
dataset_info:
- config_name: ar
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 23085256
num_examples: 2761
download_size: 9582259
dataset_size: 23085256
- config_name: bn
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 34795181
num_examples: 2945
download_size: 10692946
dataset_size: 34795181
- config_name: en
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 31636295
num_examples: 5707
download_size: 13902931
dataset_size: 31636295
- config_name: es
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 18430799
num_examples: 3581
download_size: 7934347
dataset_size: 18430799
- config_name: fa
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 23124051
num_examples: 3298
download_size: 9006826
dataset_size: 23124051
- config_name: fi
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 10275011
num_examples: 1972
download_size: 5156216
dataset_size: 10275011
- config_name: fr
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 10022166
num_examples: 2004
download_size: 4815465
dataset_size: 10022166
- config_name: hi
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 20483583
num_examples: 2041
download_size: 6573144
dataset_size: 20483583
- config_name: id
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 38136877
num_examples: 7244
download_size: 16101961
dataset_size: 38136877
- config_name: ja
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 34740939
num_examples: 5743
download_size: 15926749
dataset_size: 34740939
- config_name: ko
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 6677931
num_examples: 1314
download_size: 3237577
dataset_size: 6677931
- config_name: ru
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 35062570
num_examples: 3804
download_size: 14049413
dataset_size: 35062570
- config_name: sw
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 1031878
num_examples: 203
download_size: 527001
dataset_size: 1031878
- config_name: te
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 1986018
num_examples: 206
download_size: 722739
dataset_size: 1986018
- config_name: th
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 34991199
num_examples: 3058
download_size: 11282773
dataset_size: 34991199
- config_name: zh
features:
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: reason
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 9474623
num_examples: 2214
download_size: 4861442
dataset_size: 9474623
configs:
- config_name: ar
data_files:
- split: train
path: ar/train-*
- config_name: bn
data_files:
- split: train
path: bn/train-*
- config_name: en
data_files:
- split: train
path: en/train-*
- config_name: es
data_files:
- split: train
path: es/train-*
- config_name: fa
data_files:
- split: train
path: fa/train-*
- config_name: fi
data_files:
- split: train
path: fi/train-*
- config_name: fr
data_files:
- split: train
path: fr/train-*
- config_name: hi
data_files:
- split: train
path: hi/train-*
- config_name: id
data_files:
- split: train
path: id/train-*
- config_name: ja
data_files:
- split: train
path: ja/train-*
- config_name: ko
data_files:
- split: train
path: ko/train-*
- config_name: ru
data_files:
- split: train
path: ru/train-*
- config_name: sw
data_files:
- split: train
path: sw/train-*
- config_name: te
data_files:
- split: train
path: te/train-*
- config_name: th
data_files:
- split: train
path: th/train-*
- config_name: zh
data_files:
- split: train
path: zh/train-*
---
# Dataset Card for "miracl-raft-instruct-1-pos-4-neg-mistral"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713025362 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 15473
num_examples: 35
download_size: 11663
dataset_size: 15473
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713025362"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thermostatic/flowers | ---
license: mit
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset is a mix of the Capybara, Open-Platypus-Commercial and Wizard-Vicuna-Unfiltered datasets. As such, it can be used for commercial purposes. These base datasets provide a strong reasoning background across multiple fields of human knowledge, which is why I chose all of them.
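A minimal sketch of the kind of mixing described above (illustrative only — the actual merge scripts are not yet published, and the pools below are hypothetical placeholders standing in for the three source datasets):

```python
import random

# Placeholder pools standing in for Capybara, Open-Platypus-Commercial,
# and Wizard-Vicuna-Unfiltered examples (hypothetical records).
capybara = [{"source": "capybara", "id": i} for i in range(5)]
platypus = [{"source": "platypus", "id": i} for i in range(5)]
wizard = [{"source": "wizard", "id": i} for i in range(5)]

# Concatenate the sources and shuffle with a fixed seed so the
# resulting mix is reproducible.
mixed = capybara + platypus + wizard
random.Random(42).shuffle(mixed)

print(len(mixed), {r["source"] for r in mixed})
```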
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Thermostatic
- **Funded by [optional]:** Thermostatic
- **Shared by [optional]:** Thermostatic
- **Language(s) (NLP):** English
- **License:** MIT
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** No repository yet, will provide the scripts shortly
- **Paper [optional]:** No paper
- **Demo [optional]:** No demo yet
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yjernite/prof_images_blip__prompthero-openjourney-v4 | ---
dataset_info:
features:
- name: images
dtype: image
- name: embeddings
sequence: float32
splits:
- name: bartender
num_bytes: 4389977.0
num_examples: 100
- name: accountant
num_bytes: 3215772.0
num_examples: 100
- name: baker
num_bytes: 3986834.0
num_examples: 100
- name: artist
num_bytes: 3607453.0
num_examples: 100
- name: author
num_bytes: 3672416.0
num_examples: 100
- name: clergy
num_bytes: 3205746.0
num_examples: 100
- name: customer_service_representative
num_bytes: 3248196.0
num_examples: 100
- name: dental_hygienist
num_bytes: 3301158.0
num_examples: 100
- name: electrician
num_bytes: 4217689.0
num_examples: 100
- name: carpet_installer
num_bytes: 4563896.0
num_examples: 100
- name: civil_engineer
num_bytes: 3938254.0
num_examples: 100
- name: ceo
num_bytes: 2928809.0
num_examples: 100
- name: computer_support_specialist
num_bytes: 3598211.0
num_examples: 100
- name: dentist
num_bytes: 3152592.0
num_examples: 100
- name: butcher
num_bytes: 4539000.0
num_examples: 100
- name: courier
num_bytes: 4146333.0
num_examples: 100
- name: computer_programmer
num_bytes: 4075572.0
num_examples: 100
- name: correctional_officer
num_bytes: 3875009.0
num_examples: 100
- name: executive_assistant
num_bytes: 3060421.0
num_examples: 100
- name: designer
num_bytes: 3484381.0
num_examples: 100
- name: aerospace_engineer
num_bytes: 4288164.0
num_examples: 100
- name: data_entry_keyer
num_bytes: 4283347.0
num_examples: 100
- name: event_planner
num_bytes: 3610369.0
num_examples: 100
- name: cook
num_bytes: 3790487.0
num_examples: 100
- name: construction_worker
num_bytes: 4161361.0
num_examples: 100
- name: air_conditioning_installer
num_bytes: 4432735.0
num_examples: 100
- name: electrical_engineer
num_bytes: 4664222.0
num_examples: 100
- name: career_counselor
num_bytes: 3458189.0
num_examples: 100
- name: clerk
num_bytes: 3289972.0
num_examples: 100
- name: director
num_bytes: 3198823.0
num_examples: 100
- name: cleaner
num_bytes: 3447924.0
num_examples: 100
- name: computer_systems_analyst
num_bytes: 3963881.0
num_examples: 100
- name: dental_assistant
num_bytes: 3092309.0
num_examples: 100
- name: architect
num_bytes: 3545898.0
num_examples: 100
- name: drywall_installer
num_bytes: 3554202.0
num_examples: 100
- name: childcare_worker
num_bytes: 3587994.0
num_examples: 100
- name: community_manager
num_bytes: 3682350.0
num_examples: 100
- name: carpenter
num_bytes: 4416973.0
num_examples: 100
- name: claims_appraiser
num_bytes: 3412701.0
num_examples: 100
- name: dispatcher
num_bytes: 4038038.0
num_examples: 100
- name: cashier
num_bytes: 3850933.0
num_examples: 100
- name: detective
num_bytes: 3164373.0
num_examples: 100
- name: engineer
num_bytes: 3757806.0
num_examples: 100
- name: dishwasher
num_bytes: 4884178.0
num_examples: 100
- name: credit_counselor
num_bytes: 3166833.0
num_examples: 100
- name: doctor
num_bytes: 3225393.0
num_examples: 100
- name: compliance_officer
num_bytes: 3275293.0
num_examples: 100
- name: aide
num_bytes: 3030976.0
num_examples: 100
- name: bus_driver
num_bytes: 4244558.0
num_examples: 100
- name: coach
num_bytes: 3508320.0
num_examples: 100
download_size: 194428990
dataset_size: 186236321.0
---
# Dataset Card for "prof_images_blip__prompthero-openjourney-v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aprab/pii-english | ---
dataset_info:
features:
- name: source_text
dtype: string
- name: target_text
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 25505470.759636868
num_examples: 29908
- name: test
num_bytes: 6767779.246773383
num_examples: 7946
download_size: 15472722
dataset_size: 32273250.00641025
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_malhajar__meditron-7b-chat | ---
pretty_name: Evaluation run of malhajar/meditron-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [malhajar/meditron-7b-chat](https://huggingface.co/malhajar/meditron-7b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_malhajar__meditron-7b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T12:44:32.691414](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__meditron-7b-chat/blob/main/results_2023-12-13T12-44-32.691414.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4088088785737693,\n\
\ \"acc_stderr\": 0.03432891874934368,\n \"acc_norm\": 0.412520814098851,\n\
\ \"acc_norm_stderr\": 0.03513603001068187,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.016435632932815032,\n \"mc2\": 0.48561313890109503,\n\
\ \"mc2_stderr\": 0.014556131200430611\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47440273037542663,\n \"acc_stderr\": 0.014592230885298964,\n\
\ \"acc_norm\": 0.507679180887372,\n \"acc_norm_stderr\": 0.014609667440892574\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5622385978888668,\n\
\ \"acc_stderr\": 0.004950973231188741,\n \"acc_norm\": 0.753734315873332,\n\
\ \"acc_norm_stderr\": 0.004299546103761425\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730575,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730575\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4774193548387097,\n\
\ \"acc_stderr\": 0.028414985019707868,\n \"acc_norm\": 0.4774193548387097,\n\
\ \"acc_norm_stderr\": 0.028414985019707868\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.03903698647748441,\n\
\ \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.03903698647748441\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.035402943770953675,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.035402943770953675\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5751295336787565,\n \"acc_stderr\": 0.035674713352125395,\n\
\ \"acc_norm\": 0.5751295336787565,\n \"acc_norm_stderr\": 0.035674713352125395\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150013,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150013\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4972477064220184,\n \"acc_stderr\": 0.021436998359765324,\n \"\
acc_norm\": 0.4972477064220184,\n \"acc_norm_stderr\": 0.021436998359765324\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828978,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828978\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47549019607843135,\n \"acc_stderr\": 0.035050931943487976,\n \"\
acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.035050931943487976\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5864978902953587,\n \"acc_stderr\": 0.03205649904851859,\n \
\ \"acc_norm\": 0.5864978902953587,\n \"acc_norm_stderr\": 0.03205649904851859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.45739910313901344,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.043171711948702556,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.043171711948702556\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831028,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831028\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4174757281553398,\n \"acc_stderr\": 0.04882840548212238,\n\
\ \"acc_norm\": 0.4174757281553398,\n \"acc_norm_stderr\": 0.04882840548212238\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5769230769230769,\n\
\ \"acc_stderr\": 0.032366121762202014,\n \"acc_norm\": 0.5769230769230769,\n\
\ \"acc_norm_stderr\": 0.032366121762202014\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5159642401021711,\n\
\ \"acc_stderr\": 0.017870847506081738,\n \"acc_norm\": 0.5159642401021711,\n\
\ \"acc_norm_stderr\": 0.017870847506081738\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4190751445086705,\n \"acc_stderr\": 0.026564178111422622,\n\
\ \"acc_norm\": 0.4190751445086705,\n \"acc_norm_stderr\": 0.026564178111422622\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.02818059632825929,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.02818059632825929\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.45016077170418006,\n\
\ \"acc_stderr\": 0.02825666072336019,\n \"acc_norm\": 0.45016077170418006,\n\
\ \"acc_norm_stderr\": 0.02825666072336019\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.027586006221607697,\n\
\ \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.027586006221607697\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320193,\n \
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320193\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3259452411994785,\n\
\ \"acc_stderr\": 0.011971507294982779,\n \"acc_norm\": 0.3259452411994785,\n\
\ \"acc_norm_stderr\": 0.011971507294982779\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4166666666666667,\n \"acc_stderr\": 0.01994491413687358,\n \
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.01994491413687358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.43636363636363634,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.35918367346938773,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.35918367346938773,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4577114427860697,\n\
\ \"acc_stderr\": 0.035228658640995975,\n \"acc_norm\": 0.4577114427860697,\n\
\ \"acc_norm_stderr\": 0.035228658640995975\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5906432748538012,\n \"acc_stderr\": 0.03771283107626545,\n\
\ \"acc_norm\": 0.5906432748538012,\n \"acc_norm_stderr\": 0.03771283107626545\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.016435632932815032,\n \"mc2\": 0.48561313890109503,\n\
\ \"mc2_stderr\": 0.014556131200430611\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \
\ \"acc_stderr\": 0.007950942148339342\n }\n}\n```"
repo_url: https://huggingface.co/malhajar/meditron-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|arc:challenge|25_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|gsm8k|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hellaswag|10_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T12-44-32.691414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T12-44-32.691414.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- '**/details_harness|winogrande|5_2023-12-13T12-44-32.691414.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T12-44-32.691414.parquet'
- config_name: results
data_files:
- split: 2023_12_13T12_44_32.691414
path:
- results_2023-12-13T12-44-32.691414.parquet
- split: latest
path:
- results_2023-12-13T12-44-32.691414.parquet
---
# Dataset Card for Evaluation run of malhajar/meditron-7b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [malhajar/meditron-7b-chat](https://huggingface.co/malhajar/meditron-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_malhajar__meditron-7b-chat",
"harness_winogrande_5",
split="train")
```
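Each task evaluated by the harness maps to its own configuration name. As the config list above suggests, the name appears to be derived from the harness task id by joining its parts with underscores (e.g. `harness|hendrycksTest-world_religions|5` becomes `harness_hendrycksTest_world_religions_5`). A small helper illustrating this inferred mapping:

```python
def task_to_config(task_id: str) -> str:
    """Derive the dataset config name from a harness task id.

    The mapping is inferred from the config list in this card, e.g.
    "harness|hendrycksTest-world_religions|5" ->
    "harness_hendrycksTest_world_religions_5".
    """
    harness, task, n_shots = task_id.split("|")
    # Colons and hyphens in task names are replaced with underscores.
    task = task.replace(":", "_").replace("-", "_")
    return f"{harness}_{task}_{n_shots}"
```

The resulting string can be passed as the second argument to `load_dataset` as in the example above.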
## Latest results
These are the [latest results from run 2023-12-13T12:44:32.691414](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__meditron-7b-chat/blob/main/results_2023-12-13T12-44-32.691414.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the timestamped splits and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.4088088785737693,
"acc_stderr": 0.03432891874934368,
"acc_norm": 0.412520814098851,
"acc_norm_stderr": 0.03513603001068187,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815032,
"mc2": 0.48561313890109503,
"mc2_stderr": 0.014556131200430611
},
"harness|arc:challenge|25": {
"acc": 0.47440273037542663,
"acc_stderr": 0.014592230885298964,
"acc_norm": 0.507679180887372,
"acc_norm_stderr": 0.014609667440892574
},
"harness|hellaswag|10": {
"acc": 0.5622385978888668,
"acc_stderr": 0.004950973231188741,
"acc_norm": 0.753734315873332,
"acc_norm_stderr": 0.004299546103761425
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4774193548387097,
"acc_stderr": 0.028414985019707868,
"acc_norm": 0.4774193548387097,
"acc_norm_stderr": 0.028414985019707868
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.03903698647748441,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.03903698647748441
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.035402943770953675,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.035402943770953675
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5751295336787565,
"acc_stderr": 0.035674713352125395,
"acc_norm": 0.5751295336787565,
"acc_norm_stderr": 0.035674713352125395
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4972477064220184,
"acc_stderr": 0.021436998359765324,
"acc_norm": 0.4972477064220184,
"acc_norm_stderr": 0.021436998359765324
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828978,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828978
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.035050931943487976,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.035050931943487976
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5864978902953587,
"acc_stderr": 0.03205649904851859,
"acc_norm": 0.5864978902953587,
"acc_norm_stderr": 0.03205649904851859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.043171711948702556,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.043171711948702556
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831028,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831028
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.4174757281553398,
"acc_stderr": 0.04882840548212238,
"acc_norm": 0.4174757281553398,
"acc_norm_stderr": 0.04882840548212238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.032366121762202014,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.032366121762202014
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5159642401021711,
"acc_stderr": 0.017870847506081738,
"acc_norm": 0.5159642401021711,
"acc_norm_stderr": 0.017870847506081738
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4190751445086705,
"acc_stderr": 0.026564178111422622,
"acc_norm": 0.4190751445086705,
"acc_norm_stderr": 0.026564178111422622
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.45016077170418006,
"acc_stderr": 0.02825666072336019,
"acc_norm": 0.45016077170418006,
"acc_norm_stderr": 0.02825666072336019
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.027586006221607697,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.027586006221607697
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320193,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320193
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3259452411994785,
"acc_stderr": 0.011971507294982779,
"acc_norm": 0.3259452411994785,
"acc_norm_stderr": 0.011971507294982779
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.35918367346938773,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.35918367346938773,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4577114427860697,
"acc_stderr": 0.035228658640995975,
"acc_norm": 0.4577114427860697,
"acc_norm_stderr": 0.035228658640995975
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5906432748538012,
"acc_stderr": 0.03771283107626545,
"acc_norm": 0.5906432748538012,
"acc_norm_stderr": 0.03771283107626545
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815032,
"mc2": 0.48561313890109503,
"mc2_stderr": 0.014556131200430611
},
"harness|winogrande|5": {
"acc": 0.7316495659037096,
"acc_stderr": 0.012453340359561195
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339342
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dnovak232/sql_create_context-v4-mssql-instruct | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 43435483
num_examples: 78285
download_size: 13611891
dataset_size: 43435483
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
McSpicyWithMilo/instruction-types-0.2split | ---
dataset_info:
features:
- name: instruction_type
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 28238
num_examples: 320
- name: test
num_bytes: 6791
num_examples: 80
download_size: 18706
dataset_size: 35029
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "instruction-types"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Arain/UnitTest-Finetuning | ---
license: apache-2.0
---
# Dataset Card for UnitTest-Finetuning Corpus
## Dataset Summary
The UnitTest-Finetuning corpus is a dataset of 1.48 million instruction prompts for development tasks in Agile Development (AD) and Test-Driven Development (TDD), drawn from about 5.5K Java and 16.2K Python open-source projects on GitHub.
## Supported Tasks
- `Method-Test Dataset`: can be used to train a model for test completion and test generation tasks in Agile Development.
- `Docstring-Test-Method Dataset`: can be used to train a model for test completion, test generation, and functional code generation tasks in Test-Driven Development.
## Languages
- Java programming language
- Python programming language
## Dataset Structure
### Data Instances
Each data point represents a single instruction prompt:
```json
[
  {
    "instruction": "You are a professional java software engineer...",
    "output": "```java\npackage com.google.api.client.util.escape;...```"
  },
  {
    "instruction": "You are a professional java software engineer...",
    "output": "```java\npackage com.google.api.client.util.escape;...```"
  }
]
```
### Prompt type
- `Test Completion in AD`:
```
You are a professional {language} software engineer. An unit test class for a focal method is under development, your task is to generate a new test method for this test class to test new aspects that have not been covered before.
You will be given the following information of the unit test class and its focal method:
1. Source code of the focal method.
2. Source code of the focal class(Code that is not relevant to focal method's execution is filtered).
3. Source code of callee examples of the focal method.
4. Source code of unit test method that is already developed(With imports and dependencies).
You will ONLY return unit test code for the focal method including necessary imports and dependencies, make sure it compile without errors, and use reflection to invoke private methods.
Note that NO additional explanations required.
Here are the information of the focal method:
1. Source code of the focal method.
{focal_method}
2. Source code of the focal class(Codes that are may not related to focal method are filtered).
{focal_class}
3. Source code of callee examples of the focal method.
{callee_example}
4. Source code of unit test method that is already developed(With imports and dependencies).
{test_example}
```
- `Test Generation in AD`:
```
You are a professional {language} software engineer. You are asked to generate a complete test class for a focal method in a focal class.
You will be given the following information of the focal method:
1. Source code of the focal method.
2. Source code of the focal class(Code that is not relevant to focal method's execution is filtered).
3. Source code of callee examples of the focal method.
4. Source code of unit test method that is already developed(With imports and dependencies).
You will ONLY return unit test code for the focal method including necessary imports and dependencies, make sure it compile without errors, and use reflection to invoke private methods.
Note that no additional explanations required.
Here are the information of the focal method:
1. Source code of the focal method.
{focal_method}
2. Source code of the focal class(Codes that are may not related to focal method are filtered).
{focal_class}
3. Source code of callee examples of the focal method.
{callee_example}
4. Source code of unit test method that is already developed(With imports and dependencies).
{test_example}
Please note that the test class you return should include multiple test cases covering different functionalities. There is no upper limit on the number of test cases, but you need to ensure that the test cases provide high test coverage and test extreme and special cases of the code as much as possible.
```
- `Test Completion in TDD`:
```
You are a professional {language} software engineer proficient in utilizing the Test-Driven Development (TDD) methodology. Your development process consists of two steps: first, generating test cases based on natural language requirements, and second, creating functional code.
Currently, you're embarking on the first step and a unit test class for a requirement is under development, your task is to generate a new test method for this test class to test new aspects that have not been covered before.
You'll be provided with the following information:
1. A development requirement described in natural language.
2. Source code of unit test method that is already developed(With imports and dependencies).
You will ONLY return unit test code including necessary imports and dependencies, make sure it compile without errors, use reflection to invoke private methods, and won't test scenarios beyond the stated development requirement.
Note that no additional explanations required.
Here are the information:
1. A development requirement described in natural language.
{requirement}
2. Source code of unit test method that is already developed(With imports and dependencies).
{test_example}
```
- `Test Generation in TDD`:
```
You are a professional {language} software engineer proficient in utilizing the Test-Driven Development (TDD) methodology. Your development process consists of two steps: first, generating test cases based on natural language requirements, and second, creating functional code.
Currently, you're embarking on the first step, where you'll derive a complete test class for a focal method from a development requirement described in natural language.
You will ONLY return unit test code including necessary imports and dependencies, make sure it compile without errors, use reflection to invoke private methods, and won't test scenarios beyond the stated development requirement.
Note that no additional explanations required.
Here are the development requirement described in natural language:
{requirement}
Please note that the test class you return should include multiple test cases covering different functionalities. There is no upper limit on the number of test cases, but you need to ensure that the test cases provide high test coverage and test extreme and special cases of the code as much as possible.
```
- `Functional Code Generation in TDD`:
```
You are a professional {language} software engineer proficient in utilizing the Test-Driven Development (TDD) methodology. Your development process consists of two steps: first, generating test cases based on natural language requirements, and second, creating functional code that ensures passing those test cases.
Currently, you're embarking on the Second step, which involves generating functional code that ensures passing of all tests and can be directly executed.
You'll be provided with the following information:
1. A development requirement described in natural language.
2. Test cases generated by you in the first step of TDD development based on the aforementioned requirement.
You will ONLY return functional code including necessary imports and dependencies, make sure it compile without errors, use reflection to invoke private methods.
Note that no additional explanations required.
Here are the information:
1. A development requirement described in natural language.
{requirement}
2. Test cases generated by you in the first step of TDD development based on the aforementioned requirement.
{test_example}
```
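The templates above are plain format strings with named placeholders (`{language}`, `{requirement}`, `{test_example}`, and so on), so a concrete prompt can be assembled with Python's `str.format`. The sketch below fills an abridged version of the `Test Completion in TDD` template; the template text is shortened for illustration, and the requirement and test snippet are invented placeholder values, not real corpus entries.

```python
# Minimal sketch: instantiate one of the prompt templates above.
# The template is abridged, and `requirement` / `test_example` below
# are invented placeholder values, not entries from the corpus.

TDD_TEST_COMPLETION = (
    "You are a professional {language} software engineer proficient in "
    "utilizing the Test-Driven Development (TDD) methodology.\n"
    "Here are the information:\n"
    "1. A development requirement described in natural language.\n"
    "{requirement}\n"
    "2. Source code of unit test method that is already developed"
    "(With imports and dependencies).\n"
    "{test_example}"
)

def build_prompt(language: str, requirement: str, test_example: str) -> str:
    """Fill the named placeholders to produce one instruction prompt."""
    return TDD_TEST_COMPLETION.format(
        language=language,
        requirement=requirement,
        test_example=test_example,
    )

prompt = build_prompt(
    language="java",
    requirement="Implement a method that reverses a string.",
    test_example="@Test public void testReverse() { ... }",
)
print(prompt)
```

The full templates work the same way; only the set of placeholder names differs per prompt type.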
### Citation Information
```bibtex
``` |
sanchit-gandhi/concatenated_librispeech | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 707889.0
num_examples: 1
download_size: 0
dataset_size: 707889.0
---
# Dataset Card for "concatenated_librispeech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cvlt-mao/bc5cdr | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: tags
sequence:
class_label:
names:
'0': O
'1': B-Chemical
'2': B-Disease
'3': I-Disease
'4': I-Chemical
splits:
- name: train
num_bytes: 1888772
num_examples: 5228
- name: validation
num_bytes: 1881130
num_examples: 5330
- name: test
num_bytes: 2000887
num_examples: 5865
download_size: 1118925
dataset_size: 5770789
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-sft-full | ---
pretty_name: Evaluation run of BEE-spoke-data/zephyr-220m-sft-full
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BEE-spoke-data/zephyr-220m-sft-full](https://huggingface.co/BEE-spoke-data/zephyr-220m-sft-full)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-sft-full\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T04:33:51.710520](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-sft-full/blob/main/results_2024-01-05T04-33-51.710520.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2635455099117063,\n\
\ \"acc_stderr\": 0.030898680264922977,\n \"acc_norm\": 0.2646935495456357,\n\
\ \"acc_norm_stderr\": 0.03169524466378701,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156493,\n \"mc2\": 0.43225660929564824,\n\
\ \"mc2_stderr\": 0.015552475830622107\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20648464163822525,\n \"acc_stderr\": 0.011828865619002316,\n\
\ \"acc_norm\": 0.2525597269624573,\n \"acc_norm_stderr\": 0.012696728980207706\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2757418840868353,\n\
\ \"acc_stderr\": 0.004459740315490865,\n \"acc_norm\": 0.29028082055367455,\n\
\ \"acc_norm_stderr\": 0.004529642828546404\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n\
\ \"acc_stderr\": 0.03547854198560828,\n \"acc_norm\": 0.21481481481481482,\n\
\ \"acc_norm_stderr\": 0.03547854198560828\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2947976878612717,\n\
\ \"acc_stderr\": 0.03476599607516479,\n \"acc_norm\": 0.2947976878612717,\n\
\ \"acc_norm_stderr\": 0.03476599607516479\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.029101290698386715,\n\
\ \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.029101290698386715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518752,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518752\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.11724137931034483,\n \"acc_stderr\": 0.026808974229173797,\n\
\ \"acc_norm\": 0.11724137931034483,\n \"acc_norm_stderr\": 0.026808974229173797\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3225806451612903,\n\
\ \"acc_stderr\": 0.02659308451657228,\n \"acc_norm\": 0.3225806451612903,\n\
\ \"acc_norm_stderr\": 0.02659308451657228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048573,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048573\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371216,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371216\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3376146788990826,\n\
\ \"acc_stderr\": 0.020275265986638903,\n \"acc_norm\": 0.3376146788990826,\n\
\ \"acc_norm_stderr\": 0.020275265986638903\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n\
\ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.29596412556053814,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2892561983471074,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.01440029642922559,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.01440029642922559\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.0252616912197295,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.0252616912197295\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18971061093247588,\n\
\ \"acc_stderr\": 0.022268196258783225,\n \"acc_norm\": 0.18971061093247588,\n\
\ \"acc_norm_stderr\": 0.022268196258783225\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042114,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503786,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503786\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2522816166883963,\n\
\ \"acc_stderr\": 0.011092789056875245,\n \"acc_norm\": 0.2522816166883963,\n\
\ \"acc_norm_stderr\": 0.011092789056875245\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22875816993464052,\n \"acc_stderr\": 0.01699272346546623,\n \
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.01699272346546623\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789427,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789427\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686399,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686399\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156493,\n \"mc2\": 0.43225660929564824,\n\
\ \"mc2_stderr\": 0.015552475830622107\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.516179952644041,\n \"acc_stderr\": 0.014045126130978601\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401502001\n }\n}\n```"
repo_url: https://huggingface.co/BEE-spoke-data/zephyr-220m-sft-full
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-33-51.710520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-33-51.710520.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- '**/details_harness|winogrande|5_2024-01-05T04-33-51.710520.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T04-33-51.710520.parquet'
- config_name: results
data_files:
- split: 2024_01_05T04_33_51.710520
path:
- results_2024-01-05T04-33-51.710520.parquet
- split: latest
path:
- results_2024-01-05T04-33-51.710520.parquet
---
# Dataset Card for Evaluation run of BEE-spoke-data/zephyr-220m-sft-full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BEE-spoke-data/zephyr-220m-sft-full](https://huggingface.co/BEE-spoke-data/zephyr-220m-sft-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-sft-full",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-05T04:33:51.710520](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-sft-full/blob/main/results_2024-01-05T04-33-51.710520.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2635455099117063,
"acc_stderr": 0.030898680264922977,
"acc_norm": 0.2646935495456357,
"acc_norm_stderr": 0.03169524466378701,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156493,
"mc2": 0.43225660929564824,
"mc2_stderr": 0.015552475830622107
},
"harness|arc:challenge|25": {
"acc": 0.20648464163822525,
"acc_stderr": 0.011828865619002316,
"acc_norm": 0.2525597269624573,
"acc_norm_stderr": 0.012696728980207706
},
"harness|hellaswag|10": {
"acc": 0.2757418840868353,
"acc_stderr": 0.004459740315490865,
"acc_norm": 0.29028082055367455,
"acc_norm_stderr": 0.004529642828546404
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.03547854198560828,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.03547854198560828
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.03476599607516479,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.03476599607516479
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.029101290698386715,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.029101290698386715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518752,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518752
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.11724137931034483,
"acc_stderr": 0.026808974229173797,
"acc_norm": 0.11724137931034483,
"acc_norm_stderr": 0.026808974229173797
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3225806451612903,
"acc_stderr": 0.02659308451657228,
"acc_norm": 0.3225806451612903,
"acc_norm_stderr": 0.02659308451657228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048573,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048573
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371216,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638903,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.01440029642922559,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.01440029642922559
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0252616912197295,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0252616912197295
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18971061093247588,
"acc_stderr": 0.022268196258783225,
"acc_norm": 0.18971061093247588,
"acc_norm_stderr": 0.022268196258783225
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042114,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503786,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503786
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2522816166883963,
"acc_stderr": 0.011092789056875245,
"acc_norm": 0.2522816166883963,
"acc_norm_stderr": 0.011092789056875245
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.01699272346546623,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.01699272346546623
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789427,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789427
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156493,
"mc2": 0.43225660929564824,
"mc2_stderr": 0.015552475830622107
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.014045126130978601
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401502001
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tasksource/logical-entailment | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: A
dtype: string
- name: B
dtype: string
- name: E
dtype: int64
- name: H1
dtype: int64
- name: H2
dtype: int64
- name: H3
dtype: int64
- name: label
dtype: string
splits:
- name: train
num_bytes: 9803153
num_examples: 99876
- name: test
num_bytes: 550241
num_examples: 5000
- name: validation
num_bytes: 548346
num_examples: 5000
download_size: 2505053
dataset_size: 10901740
---
https://github.com/google-deepmind/logical-entailment-dataset
```
@inproceedings{evans2018can,
title={Can Neural Networks Understand Logical Entailment?},
author={Richard Evans and David Saxton and David Amos and Pushmeet Kohli and Edward Grefenstette},
booktitle={International Conference on Learning Representations},
year={2018},
url={https://openreview.net/forum?id=SkZxCk-0Z},
}
```
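Each row pairs two propositional formulas `A` and `B` with a binary entailment label `E` (plus baseline predictions `H1`–`H3`). As an illustrative sketch of what the label encodes, the check below decides entailment by brute-force truth-table enumeration. The operator syntax (`~`, `&`, `|`, single-letter variables) is an assumption for this example only; consult the dataset itself for its exact formula grammar.

```python
import itertools
import re


def entails(a: str, b: str) -> bool:
    """Brute-force check that formula `a` entails formula `b`.

    Assumes a toy syntax: single-letter variables, ~ (not), & (and),
    | (or), and parentheses. The dataset's actual grammar may differ.
    """
    # Translate the toy connectives into Python boolean operators.
    py_a = a.replace("~", " not ").replace("&", " and ").replace("|", " or ")
    py_b = b.replace("~", " not ").replace("&", " and ").replace("|", " or ")
    variables = sorted(set(re.findall(r"[a-z]", a + b)))
    # A entails B iff every assignment satisfying A also satisfies B.
    for values in itertools.product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if eval(py_a, {}, env) and not eval(py_b, {}, env):
            return False
    return True


print(entails("a & b", "a | b"))  # True: a conjunction entails the disjunction
print(entails("a | b", "a & b"))  # False
```

Enumeration is exponential in the number of variables, so this is only practical for the short formulas typical of such benchmarks, not a general-purpose prover.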
|
MasterThesisCBS/XSum_NO | ---
license: cc-by-4.0
language:
- 'no'
- nb
tags:
- summarization
pretty_name: XSUM Norwegian
task_categories:
- text-generation
- summarization
dataset_info:
features:
- name: title
dtype: string
- name: url
dtype: string
- name: timestamp
dtype: string
- name: body
dtype: string
- name: lead
dtype: string
- name: body_length
dtype: float64
- name: summary
dtype: string
- name: prompt_train
dtype: string
- name: prompt_test
dtype: string
splits:
- name: train
num_bytes: 284661834
num_examples: 64070
- name: test
num_bytes: 14882449
num_examples: 3373
download_size: 186192491
dataset_size: 299544283
---
# XSUM NO
A Norwegian summarization dataset custom-made for evaluation or fine-tuning of GPT models.
## Data Collection
Data were scraped from Aftenposten.no and Vg.no; the summary column for each article is built from its title and ingress (the lead paragraph).
## How to Use
```python
from datasets import load_dataset
data = load_dataset("MasterThesisCBS/XSum_NO")
```
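Since the summary column is derived from each article's title and ingress, reconstructing it from the raw fields is straightforward. The field names below come from the schema above, but the exact separator used to join the two parts is an assumption for illustration:

```python
def make_summary(title: str, lead: str) -> str:
    """Combine an article's title and ingress (lead paragraph) into a
    single summary string, mirroring how the `summary` column is built.
    The ". " separator is an assumption, not the dataset's exact format.
    """
    return f"{title.strip()}. {lead.strip()}"


# Hypothetical example row with `title` and `lead` fields as in the schema.
example = {"title": "Ny rekord", "lead": "Det ble satt ny rekord i dag."}
print(make_summary(example["title"], example["lead"]))
# Ny rekord. Det ble satt ny rekord i dag.
```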
### Dataset Curators
[John Oskar Holmen Skjeldrum](mailto:josk18ad@student.cbs.dk) and [Peder Tanberg](mailto:peha28ae@student.cbs.dk) |
awacke1/LOINC-Panels-and-Forms | ---
license: mit
---
|
open-llm-leaderboard/details_Menouar__saqr-7b-beta | ---
pretty_name: Evaluation run of Menouar/saqr-7b-beta
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Menouar/saqr-7b-beta](https://huggingface.co/Menouar/saqr-7b-beta) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Menouar__saqr-7b-beta\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T12:49:44.046455](https://huggingface.co/datasets/open-llm-leaderboard/details_Menouar__saqr-7b-beta/blob/main/results_2024-02-18T12-49-44.046455.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27030982140899557,\n\
\ \"acc_stderr\": 0.03111036577540486,\n \"acc_norm\": 0.2704987678522067,\n\
\ \"acc_norm_stderr\": 0.031811806028838624,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.3938162400030715,\n\
\ \"mc2_stderr\": 0.014166543524460336\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42150170648464164,\n \"acc_stderr\": 0.014430197069326016,\n\
\ \"acc_norm\": 0.4778156996587031,\n \"acc_norm_stderr\": 0.014597001927076133\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5774746066520613,\n\
\ \"acc_stderr\": 0.004929517011508222,\n \"acc_norm\": 0.776140211113324,\n\
\ \"acc_norm_stderr\": 0.004159773209765884\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678316,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678316\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.026199808807561915,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.026199808807561915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660185,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.034873508801977725,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.034873508801977725\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185555,\n\
\ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185555\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1935483870967742,\n\
\ \"acc_stderr\": 0.02247525852553606,\n \"acc_norm\": 0.1935483870967742,\n\
\ \"acc_norm_stderr\": 0.02247525852553606\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1625615763546798,\n \"acc_stderr\": 0.025960300064605576,\n\
\ \"acc_norm\": 0.1625615763546798,\n \"acc_norm_stderr\": 0.025960300064605576\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.18134715025906736,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.18134715025906736,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655078,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655078\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341937,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341937\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.034791855725996586,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.034791855725996586\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21651376146788992,\n \"acc_stderr\": 0.01765871059444314,\n \"\
acc_norm\": 0.21651376146788992,\n \"acc_norm_stderr\": 0.01765871059444314\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767478,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767478\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29901960784313725,\n \"acc_stderr\": 0.032133257173736156,\n \"\
acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.032133257173736156\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794089,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794089\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928313,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928313\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3418803418803419,\n\
\ \"acc_stderr\": 0.031075028526507755,\n \"acc_norm\": 0.3418803418803419,\n\
\ \"acc_norm_stderr\": 0.031075028526507755\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n\
\ \"acc_stderr\": 0.015491088951494581,\n \"acc_norm\": 0.2503192848020434,\n\
\ \"acc_norm_stderr\": 0.015491088951494581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.024547617794803835,\n\
\ \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.024547617794803835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.01431099954796144,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.01431099954796144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02378858355165854,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02378858355165854\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2522816166883963,\n\
\ \"acc_stderr\": 0.011092789056875238,\n \"acc_norm\": 0.2522816166883963,\n\
\ \"acc_norm_stderr\": 0.011092789056875238\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.31985294117647056,\n \"acc_stderr\": 0.028332959514031225,\n\
\ \"acc_norm\": 0.31985294117647056,\n \"acc_norm_stderr\": 0.028332959514031225\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.01812022425148459,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.01812022425148459\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.3938162400030715,\n\
\ \"mc2_stderr\": 0.014166543524460336\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7056037884767167,\n \"acc_stderr\": 0.012809427134352408\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07884761182714177,\n \
\ \"acc_stderr\": 0.007423390519873232\n }\n}\n```"
repo_url: https://huggingface.co/Menouar/saqr-7b-beta
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|arc:challenge|25_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|gsm8k|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hellaswag|10_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T12-49-44.046455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T12-49-44.046455.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- '**/details_harness|winogrande|5_2024-02-18T12-49-44.046455.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T12-49-44.046455.parquet'
- config_name: results
data_files:
- split: 2024_02_18T12_49_44.046455
path:
- results_2024-02-18T12-49-44.046455.parquet
- split: latest
path:
- results_2024-02-18T12-49-44.046455.parquet
---
# Dataset Card for Evaluation run of Menouar/saqr-7b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Menouar/saqr-7b-beta](https://huggingface.co/Menouar/saqr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Menouar__saqr-7b-beta",
"harness_winogrande_5",
split="train")
```
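Since each run appears as a split named after its timestamp (alongside the "latest" alias), you can also pick the most recent run programmatically. A minimal sketch, assuming the split-name format shown in this card (`YYYY_MM_DDTHH_MM_SS.ffffff`):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Assumes split names follow the pattern seen in this card,
    e.g. '2024_02_18T12_49_44.046455'; the 'latest' alias is skipped.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(latest_split(["2024_02_18T12_49_44.046455", "latest"]))
```

For this dataset there is only one run, so the result matches the "latest" split, but the same helper works once additional evaluation runs are appended.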
## Latest results
These are the [latest results from run 2024-02-18T12:49:44.046455](https://huggingface.co/datasets/open-llm-leaderboard/details_Menouar__saqr-7b-beta/blob/main/results_2024-02-18T12-49-44.046455.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in its own configuration, in the "latest" split of that eval):
```python
{
"all": {
"acc": 0.27030982140899557,
"acc_stderr": 0.03111036577540486,
"acc_norm": 0.2704987678522067,
"acc_norm_stderr": 0.031811806028838624,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.3938162400030715,
"mc2_stderr": 0.014166543524460336
},
"harness|arc:challenge|25": {
"acc": 0.42150170648464164,
"acc_stderr": 0.014430197069326016,
"acc_norm": 0.4778156996587031,
"acc_norm_stderr": 0.014597001927076133
},
"harness|hellaswag|10": {
"acc": 0.5774746066520613,
"acc_stderr": 0.004929517011508222,
"acc_norm": 0.776140211113324,
"acc_norm_stderr": 0.004159773209765884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678316,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678316
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.026199808807561915,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.026199808807561915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660185,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.14,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.14,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.32413793103448274,
"acc_stderr": 0.03900432069185555,
"acc_norm": 0.32413793103448274,
"acc_norm_stderr": 0.03900432069185555
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1935483870967742,
"acc_stderr": 0.02247525852553606,
"acc_norm": 0.1935483870967742,
"acc_norm_stderr": 0.02247525852553606
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1625615763546798,
"acc_stderr": 0.025960300064605576,
"acc_norm": 0.1625615763546798,
"acc_norm_stderr": 0.025960300064605576
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18134715025906736,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.18134715025906736,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655078,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655078
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341937,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341937
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.034791855725996586,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.034791855725996586
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21651376146788992,
"acc_stderr": 0.01765871059444314,
"acc_norm": 0.21651376146788992,
"acc_norm_stderr": 0.01765871059444314
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767478,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767478
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29901960784313725,
"acc_stderr": 0.032133257173736156,
"acc_norm": 0.29901960784313725,
"acc_norm_stderr": 0.032133257173736156
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928313,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928313
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3418803418803419,
"acc_stderr": 0.031075028526507755,
"acc_norm": 0.3418803418803419,
"acc_norm_stderr": 0.031075028526507755
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.015491088951494581,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.015491088951494581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.024547617794803835,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.024547617794803835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.01431099954796144,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.01431099954796144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02378858355165854,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02378858355165854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2522816166883963,
"acc_stderr": 0.011092789056875238,
"acc_norm": 0.2522816166883963,
"acc_norm_stderr": 0.011092789056875238
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.31985294117647056,
"acc_stderr": 0.028332959514031225,
"acc_norm": 0.31985294117647056,
"acc_norm_stderr": 0.028332959514031225
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.01812022425148459,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.01812022425148459
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.3938162400030715,
"mc2_stderr": 0.014166543524460336
},
"harness|winogrande|5": {
"acc": 0.7056037884767167,
"acc_stderr": 0.012809427134352408
},
"harness|gsm8k|5": {
"acc": 0.07884761182714177,
"acc_stderr": 0.007423390519873232
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FSMBench/fsmbench_what_will_be_the_state_12K_image | ---
dataset_info:
features:
- name: query_id
dtype: string
- name: fsm_id
dtype: string
- name: fsm_json
dtype: string
- name: difficulty_level
dtype: int64
- name: transition_matrix
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: substring_index
dtype: int64
- name: number_of_states
dtype: int64
- name: number_of_alphabets
dtype: int64
- name: state_alpha_combo
dtype: string
- name: image
dtype: image
splits:
- name: validation
num_bytes: 2046048983.0
num_examples: 12800
download_size: 53085449
dataset_size: 2046048983.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
Svenni551/toxic-full-uncensored-v1.0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: output
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1674833
num_examples: 570
download_size: 847554
dataset_size: 1674833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Shravanig/fire_detection_final | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Fire
'1': Normal
'2': Smoke
splits:
- name: train
num_bytes: 160965820.64
num_examples: 6060
- name: validation
num_bytes: 85813019.0
num_examples: 756
- name: test
num_bytes: 93348677.0
num_examples: 759
download_size: 891539912
dataset_size: 340127516.64
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
rokset3/slimpajama | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: redpajama_set_name
dtype: string
splits:
- name: train
num_bytes: 23874206724
num_examples: 5489000
download_size: 13962151299
dataset_size: 23874206724
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "slimpajama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fufufukakaka/pokemon_party_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2017764
num_examples: 19698
download_size: 569950
dataset_size: 2017764
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/FGVC_Aircraft_test_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_with_openai_classes_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 44840
num_examples: 100
download_size: 12275
dataset_size: 44840
---
# Dataset Card for "FGVC_Aircraft_test_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_187 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 783516224
num_examples: 153872
download_size: 796511672
dataset_size: 783516224
---
# Dataset Card for "chunk_187"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ssbuild/moss_sft_002 | ---
license: apache-2.0
---
|
zhangshuai507653/testdataset12138 | ---
license: bigscience-openrail-m
---
|
jmoney54378256438905/cybersharter-v3 | ---
license: cc-by-nd-4.0
---
|
mizunorlk/cariuchav3 | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_stsb_fixin_future | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 12230
num_examples: 54
- name: test
num_bytes: 7214
num_examples: 36
- name: train
num_bytes: 20574
num_examples: 84
download_size: 36723
dataset_size: 40018
---
# Dataset Card for "MULTI_VALUE_stsb_fixin_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/final_train_v2_390000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 9160829.1
num_examples: 27000
- name: test
num_bytes: 1017869.9
num_examples: 3000
download_size: 4463175
dataset_size: 10178699.0
---
# Dataset Card for "final_train_v2_390000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
houck2040/research_news | ---
license: mit
---
|
zliu333/truck_at_port3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 67546834.0
num_examples: 45
download_size: 67529720
dataset_size: 67546834.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
matlok/python-text-copilot-training-instruct | ---
license:
- other
pretty_name: >-
python copilot instructions on how to code using alpaca and yaml
dataset_info:
- config_name: view_01_transformers_src
splits:
- name: view_01_transformers_src
- config_name: view_02_pytorch_fsdp
splits:
- name: view_02_pytorch_fsdp
- config_name: view_03_deepspeed_runtime
splits:
- name: view_03_deepspeed_runtime
- config_name: view_schema
splits:
- name: view_schema
configs:
- config_name: view_01_transformers_src
data_files:
- split: view_01_transformers_src
path: files/lok-python-copilot-text.instruct-v1_00000053.parquet
- config_name: view_02_pytorch_fsdp
data_files:
- split: view_02_pytorch_fsdp
path: files/lok-python-copilot-text.instruct-v1_00000040.parquet
- config_name: view_03_deepspeed_runtime
data_files:
- split: view_03_deepspeed_runtime
path: files/lok-python-copilot-text.instruct-v1_00000019.parquet
- config_name: view_schema
data_files:
- split: view_schema
path: files/lok-python-copilot-text.instruct-v1_00000002.parquet
size_categories:
- 1M<n<10M
tags:
- python-copilot
- python-coding
- python-architecture
- knowledge-graphs
- multimodal
- text-image-audio
- fine-tuning
- training
- question-answering
- image-knowledge-graph
- alpaca
- mp3
- png
- text
- instruct
- coding
- task
- prompt
- response
- yaml
# supported task_categories
# text-classification, token-classification, table-question-answering, question-answering, zero-shot-classification, translation, summarization, conversational, feature-extraction, text-generation, text2text-generation, fill-mask, sentence-similarity, text-to-speech, text-to-audio, automatic-speech-recognition, audio-to-audio, audio-classification, voice-activity-detection, depth-estimation, image-classification, object-detection, image-segmentation, text-to-image, image-to-text, image-to-image, image-to-video, unconditional-image-generation, video-classification, reinforcement-learning, robotics, tabular-classification, tabular-regression, tabular-to-text, table-to-text, multiple-choice, text-retrieval, time-series-forecasting, text-to-video, visual-question-answering, document-question-answering, zero-shot-image-classification, graph-ml, mask-generation, zero-shot-object-detection, text-to-3d, image-to-3d, other
task_categories:
- text-generation
- question-answering
# supported task_ids
# acceptability-classification, entity-linking-classification, fact-checking, intent-classification, language-identification, multi-class-classification, multi-label-classification, multi-input-text-classification, natural-language-inference, semantic-similarity-classification, sentiment-classification, topic-classification, semantic-similarity-scoring, sentiment-scoring, sentiment-analysis, hate-speech-detection, text-scoring, named-entity-recognition, part-of-speech, parsing, lemmatization, word-sense-disambiguation, coreference-resolution, extractive-qa, open-domain-qa, closed-domain-qa, news-articles-summarization, news-articles-headline-generation, dialogue-generation, dialogue-modeling, language-modeling, text-simplification, explanation-generation, abstractive-qa, open-domain-abstractive-qa, closed-domain-qa, open-book-qa, closed-book-qa, slot-filling, masked-language-modeling, keyword-spotting, speaker-identification, audio-intent-classification, audio-emotion-recognition, audio-language-identification, multi-label-image-classification, multi-class-image-classification, face-detection, vehicle-detection, instance-segmentation, semantic-segmentation, panoptic-segmentation, image-captioning, image-inpainting, image-colorization, super-resolution, grasping, task-planning, tabular-multi-class-classification, tabular-multi-label-classification, tabular-single-column-regression, rdf-to-text, multiple-choice-qa, multiple-choice-coreference-resolution, document-retrieval, utterance-retrieval, entity-linking-retrieval, fact-checking-retrieval, univariate-time-series-forecasting, multivariate-time-series-forecasting, visual-question-answering, document-question-answering
task_ids:
- parsing
---
## Python Copilot Instructions on How to Code using Alpaca and Yaml
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based on the code), returns (ordered based on the code), arguments (ordered based on the code), and more.
- Rows: 1737704
- Size: 28.6 GB
- Data type: text
- Format: Introduction on code usage using alpaca and yaml response
### Schema
The instruction alpaca text with yaml response is in the **desc** column:
```json
{
"active": "bool",
"args": "string",
"args_len": "float64",
"audio_file": "string",
"audio_path": "string",
"class_bases": "string",
"class_name": "string",
"code": "string",
"code_len": "float64",
"desc": "string",
"desc_docstr": "string",
"desc_docstr_len": "float64",
"desc_len": "int64",
"docstr": "string",
"docstr_len": "int64",
"file_path": "string",
"file_type": "string",
"function_names": "string",
"gen_bytes": "int64",
"gen_data_type": "string",
"gen_mode": "string",
"gen_size": "int64",
"gen_valid": "string",
"height": "int64",
"image_file": "string",
"image_path": "string",
"method_names": "string",
"name": "string",
"num_all_bases": "int64",
"num_bases": "int64",
"num_classes": "int64",
"num_functions": "float64",
"num_imports": "int64",
"num_methods": "float64",
"prompts": "string",
"raises": "string",
"raises_len": "float64",
"recsize": "int64",
"repo": "string",
"returns": "string",
"returns_len": "float64",
"size": "int64",
"src_object": "string",
"sub_file": "string",
"total_objects": "int64",
"usage": "string",
"usages": "string",
"width": "int64"
}
```
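As a rough illustration of how a record following this schema can be consumed (the field values below are hypothetical placeholders, not real rows from the dataset):

```python
# Hypothetical record following the schema above; only the field names
# come from the schema -- the values are made up for illustration.
row = {
    "name": "transformers/src/example",
    "class_name": "ExampleClass",
    "num_methods": 2.0,
    "desc": "Instruction text with an alpaca prompt and a yaml response...",
}


def summarize(record: dict) -> str:
    """Build a one-line summary from a record's metadata columns."""
    class_name = record.get("class_name") or "(module-level)"
    num_methods = int(record.get("num_methods") or 0)
    return f'{record["name"]}: {class_name} ({num_methods} methods)'


# The alpaca instruction text with the yaml response lives in "desc";
# the remaining columns carry metadata about the source code.
print(summarize(row))
print(row["desc"][:40])
```

The same access pattern applies to real rows once the dataset is loaded as shown in the next section.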
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-text-copilot-training-instruct", data_dir="files")
```
|
houck2040/artisci | ---
license: mit
---
|
xiaojuan0920/cskg_2 | ---
license: openrail
---
|
corypaik/coda | ---
annotations_creators:
- crowdsourced
language_creators:
- expert-generated
language:
- en
language_bcp47:
- en-US
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: CoDa
paperswithcode_id: coda
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-scoring
task_ids:
- text-scoring-other-distribution-prediction
---
# Dataset Card for CoDa
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [nala-cub/coda](https://github.com/nala-cub/coda)
- **Paper:** [The World of an Octopus: How Reporting Bias Influences a Language Model's Perception of Color](https://arxiv.org/abs/2110.08182)
- **Point of Contact:** [Cory Paik](cory.paik@colorado.edu)
### Dataset Summary
*The Color Dataset* (CoDa) is a probing dataset to evaluate the representation of visual properties in language models. CoDa consists of color distributions for 521 common objects, which are split into 3 groups. We denote these groups as Single, Multi, and Any, which represent the typical objects in each group.
The default configuration of CoDa uses 10 CLIP-style templates (e.g. "A photo of a [object]"), and 10 cloze-style templates (e.g. "Everyone knows most [object] are [color].").
### Supported Tasks and Leaderboards
This version of the dataset consists of the filtered and templated examples as cloze style questions. See the [GitHub](https://github.com/nala-cub/coda) repo for the raw data (e.g. unfiltered annotations) as well as example usage with GPT-2, RoBERTa, ALBERT, and CLIP.
### Languages
The text in the dataset is in English. The associated BCP-47 code is `en-US`.
## Dataset Structure
### Data Instances
An example looks like this:
```json
{
"text": "All rulers are [MASK].",
"label": [
0.0181818176, 0.0363636352, 0.3077272773, 0.0181818176, 0.0363636352,
0.086363636, 0.0363636352, 0.0363636352, 0.0363636352, 0.086363636,
0.301363647
],
"template_group": 1,
"template_idx": 0,
"class_id": "/m/0hdln",
"display_name": "Ruler",
"object_group": 2,
"ngram": "ruler"
}
```
### Data Fields
- `text`: The templated example. What this is depends on the value of `template_group`.
- `template_group=0`: A CLIP style example. There are no `[MASK]` tokens in these examples.
- `template_group=1`: A cloze style example. Note that all templates have `[MASK]` as the last word, but in most cases, the period should be included.
- `label`: A list of probability values for the 11 colors. Note that these are sorted by the alphabetic order of the 11 colors (black, blue, brown, gray, green, orange, pink, purple, red, white, yellow).
- `template_group`: Type of template: `0` corresponds to a CLIP-style template (`clip-imagenet`), and `1` corresponds to a cloze-style template (`text-masked`).
- `template_idx`: The index of the template out of all templates
- `class_id`: The corresponding [OpenImages v6](https://storage.googleapis.com/openimages/web/index.html) `ClassID`.
- `display_name`: The corresponding [OpenImages v6](https://storage.googleapis.com/openimages/web/index.html) `DisplayName`.
- `object_group`: Object Group, values correspond to `Single`, `Multi`, and `Any`.
- `ngram`: Corresponding n-gram used for lookups.
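For illustration, the `label` vector from the instance above can be mapped back onto the 11 color names. This is a minimal sketch in plain Python; the color ordering and the label values are taken from this card, everything else is illustrative:

```python
# Alphabetic color order used for the label vector, per the card.
COLORS = [
    "black", "blue", "brown", "gray", "green", "orange",
    "pink", "purple", "red", "white", "yellow",
]

# Label distribution from the "Ruler" example instance above.
label = [
    0.0181818176, 0.0363636352, 0.3077272773, 0.0181818176, 0.0363636352,
    0.086363636, 0.0363636352, 0.0363636352, 0.0363636352, 0.086363636,
    0.301363647,
]

# Pair each color with its probability and pick the most likely one.
distribution = dict(zip(COLORS, label))
top_color = max(distribution, key=distribution.get)
print(top_color)  # most probable color for "Ruler" in this example
```

The same decoding applies to any row, since every `label` is an 11-way distribution in this fixed color order.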
### Data Splits
Object Splits:
| Group | All | Train | Valid | Test |
| ------ | --- | ----- | ----- | ---- |
| Single | 198 | 118 | 39 | 41 |
| Multi | 208 | 124 | 41 | 43 |
| Any | 115 | 69 | 23 | 23 |
| Total | 521 | 311 | 103 | 107 |
Example Splits:
| Group | All | Train | Valid | Test |
| ------ | ----- | ----- | ----- | ---- |
| Single | 3946 | 2346 | 780 | 820 |
| Multi | 4146 | 2466 | 820 | 860 |
| Any | 2265 | 1352 | 460 | 453 |
| Total | 10357 | 6164 | 2060 | 2133 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
CoDa is licensed under the Apache 2.0 license.
### Citation Information
```
@misc{paik2021world,
title={The World of an Octopus: How Reporting Bias Influences a Language Model's Perception of Color},
author={Cory Paik and StΓ©phane Aroca-Ouellette and Alessandro Roncone and Katharina Kann},
year={2021},
eprint={2110.08182},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
Amrit333/Amrit | ---
license: other
---
|
kaleemWaheed/twitter_dataset_1713093633 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9455
num_examples: 22
download_size: 9664
dataset_size: 9455
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1712943567 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13607
num_examples: 31
download_size: 10289
dataset_size: 13607
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712943567"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bambadij/Tweet_sentiment_analysis_Distilbert | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 1712789
num_examples: 7999
- name: eval
num_bytes: 472000
num_examples: 2000
download_size: 505986
dataset_size: 2184789
---
# Dataset Card for "Tweet_sentiment_analysis_Distilbert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thewillonline/reddit-sarcasm | ---
license: unknown
---
|
Cheetor1996/Rika_Minami | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
**Rika Minami** from **Highschool of the Dead**
- *Trained with the anime (full-final-pruned) model*
- *Works best with the ALL, MIDD, OUTD, and OUTALL LoRA block weights, at weights of 0.7+* |
flamesbob/Duality_style | ---
license: creativeml-openrail-m
---
Using the prompt `duality_style, art by duality_style` will give a monochrome, wings/feathers, flowers, and opposite-reflection look.

## License

This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL-M License specifies:

1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content.
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license.
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M license with all your users (please read the license entirely and carefully).

Please read the full license here. |
joey234/mmlu-college_chemistry | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 4914
num_examples: 5
- name: test
num_bytes: 363948
num_examples: 100
download_size: 72165
dataset_size: 368862
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-college_chemistry"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/f38ddf8e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1337
dataset_size: 180
---
# Dataset Card for "f38ddf8e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BroCsChan/Dawn | ---
license: c-uda
---
|
Multimodal-Fatima/Caltech101_with_background_test_facebook_opt_2.7b_Attributes_ns_6084 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 100846738.5
num_examples: 6084
- name: fewshot_1_bs_16
num_bytes: 102174531.5
num_examples: 6084
- name: fewshot_3_bs_16
num_bytes: 104837834.5
num_examples: 6084
- name: fewshot_5_bs_16
num_bytes: 107498126.5
num_examples: 6084
- name: fewshot_8_bs_16
num_bytes: 111469795.5
num_examples: 6084
download_size: 498513923
dataset_size: 526827026.5
---
# Dataset Card for "Caltech101_with_background_test_facebook_opt_2.7b_Attributes_ns_6084"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phanvancongthanh/enamine_leadlike | ---
dataset_info:
features:
- name: smiles
dtype: string
splits:
- name: train
num_bytes: 31490993396
num_examples: 672148662
download_size: 12563051169
dataset_size: 31490993396
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "enamine_leadlike"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
weqweasdas/preference_dataset_mixture | ---
dataset_info:
features:
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_score
dtype: float64
- name: chosen_score
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 806285615
num_examples: 256426
download_size: 461120853
dataset_size: 806285615
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "preference_dataset_mixture"
The dataset used to train weqweasdas/RM-Gemma-7B. See the model page for details. |
pkavumba/balanced-copa | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: BCOPA
size_categories:
- unknown
source_datasets:
- extended|copa
task_categories:
- question-answering
task_ids:
- multiple-choice-qa
---
# Dataset Card for "Balanced COPA"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://balanced-copa.github.io/](https://balanced-copa.github.io/)
- **Repository:** [Balanced COPA](https://github.com/Balanced-COPA/Balanced-COPA)
- **Paper:** [When Choosing Plausible Alternatives, Clever Hans can be Clever](https://aclanthology.org/D19-6004/)
- **Point of Contact:** [@pkavumba](https://github.com/pkavumba)
### Dataset Summary
Balanced COPA (BCOPA): An English-language Dataset for Training Robust Commonsense Causal Reasoning Models
The Balanced Choice of Plausible Alternatives dataset is a benchmark for training machine learning models that are robust to superficial cues/spurious correlations. The dataset extends the COPA dataset (Roemmele et al. 2011) with mirrored instances that mitigate token-level superficial cues in the original COPA answers. The superficial cues in the original COPA dataset result from an unbalanced token distribution between the correct and the incorrect answer choices, i.e., some tokens appear more often in the correct choices than in the incorrect ones. Balanced COPA equalizes the token distribution by adding mirrored instances with identical answer choices but different labels.
The details about the creation of Balanced COPA and the implementation of the baselines are available in the paper.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
- English
## Dataset Structure
### Data Instances
An example of 'validation' looks as follows.
```
{
"id": 1,
"premise": "My body cast a shadow over the grass.",
"choice1": "The sun was rising.",
"choice2": "The grass was cut.",
"question": "cause",
"label": 1,
"mirrored": false,
}
{
"id": 1001,
"premise": "The garden looked well-groomed.",
"choice1": "The sun was rising.",
"choice2": "The grass was cut.",
"question": "cause",
"label": 1,
"mirrored": true,
}
```
### Data Fields
The data fields are the same among all splits.
#### en
- `premise`: a `string` feature.
- `choice1`: a `string` feature.
- `choice2`: a `string` feature.
- `question`: a `string` feature.
- `label`: a `int32` feature.
- `id`: a `int32` feature.
- `mirrored`: a `bool` feature.
### Data Splits
| validation | test |
| ---------: | ---: |
| 1,000 | 500 |
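Put together, the fields above are enough to turn each instance into a two-choice prompt. A minimal sketch (the mapping of `question` to a connective follows the usual COPA reading of "cause"/"effect"; the instance is the first one shown above):

```python
def copa_prompt(example):
    """Format a (Balanced) COPA instance as two candidate sentences.

    A "cause" question reads as "<premise> because <choice>", and an
    "effect" question as "<premise> so <choice>", the usual COPA convention.
    """
    connective = "because" if example["question"] == "cause" else "so"
    stem = example["premise"].rstrip(".")
    # Lowercase the first letter of each choice so it continues the sentence.
    return [f"{stem} {connective} {choice[0].lower()}{choice[1:]}"
            for choice in (example["choice1"], example["choice2"])]

example = {
    "premise": "My body cast a shadow over the grass.",
    "choice1": "The sun was rising.",
    "choice2": "The grass was cut.",
    "question": "cause",
    "label": 1,
}
print(copa_prompt(example))
```

Because a mirrored instance reuses the same answer choices under a different premise and label, a model scored on both prompts cannot rely on single-token cues in the choices alone.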
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[Creative Commons Attribution 4.0 International (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/).
### Citation Information
```
@inproceedings{kavumba-etal-2019-choosing,
title = "When Choosing Plausible Alternatives, Clever Hans can be Clever",
author = "Kavumba, Pride and
Inoue, Naoya and
Heinzerling, Benjamin and
Singh, Keshav and
Reisert, Paul and
Inui, Kentaro",
booktitle = "Proceedings of the First Workshop on Commonsense Inference in Natural Language Processing",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D19-6004",
doi = "10.18653/v1/D19-6004",
pages = "33--42",
abstract = "Pretrained language models, such as BERT and RoBERTa, have shown large improvements in the commonsense reasoning benchmark COPA. However, recent work found that many improvements in benchmarks of natural language understanding are not due to models learning the task, but due to their increasing ability to exploit superficial cues, such as tokens that occur more often in the correct answer than the wrong one. Are BERT{'}s and RoBERTa{'}s good performance on COPA also caused by this? We find superficial cues in COPA, as well as evidence that BERT exploits these cues.To remedy this problem, we introduce Balanced COPA, an extension of COPA that does not suffer from easy-to-exploit single token cues. We analyze BERT{'}s and RoBERTa{'}s performance on original and Balanced COPA, finding that BERT relies on superficial cues when they are present, but still achieves comparable performance once they are made ineffective, suggesting that BERT learns the task to a certain degree when forced to. In contrast, RoBERTa does not appear to rely on superficial cues.",
}
@inproceedings{roemmele2011choice,
title={Choice of plausible alternatives: An evaluation of commonsense causal reasoning},
author={Roemmele, Melissa and Bejan, Cosmin Adrian and Gordon, Andrew S},
booktitle={2011 AAAI Spring Symposium Series},
year={2011},
url={https://people.ict.usc.edu/~gordon/publications/AAAI-SPRING11A.PDF},
}
```
### Contributions
Thanks to [@pkavumba](https://github.com/pkavumba) for adding this dataset.
|
joey234/mmlu-high_school_psychology-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 252392
num_examples: 545
download_size: 146713
dataset_size: 252392
---
# Dataset Card for "mmlu-high_school_psychology-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kiringodhwani/msp10 | ---
dataset_info:
features:
- name: From
sequence: string
- name: Sent
sequence: string
- name: To
sequence: string
- name: Cc
sequence: string
- name: Subject
sequence: string
- name: Attachment
sequence: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 9444843
num_examples: 7772
download_size: 3887624
dataset_size: 9444843
---
# Dataset Card for "msp10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigcode/the-stack-v2-train-smol-ids | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- other
multilinguality:
- multilingual
pretty_name: The-Stack-v2
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids: []
extra_gated_prompt: |-
## Terms of Use for The Stack v2
The Stack v2 dataset is a collection of source code in over 600 programming languages. We ask that you read and acknowledge the following points before using the dataset:
1. Downloading the dataset in bulk requires an agreement with SoftwareHeritage and INRIA. Contact [datasets@softwareheritage.org](mailto:datasets@softwareheritage.org?subject=TheStackV2%20request%20for%20dataset%20access%20information) for more information.
2. If you are using the dataset to train models you must adhere to the SoftwareHeritage [principles for language model training](https://www.softwareheritage.org/2023/10/19/swh-statement-on-llm-for-code/).
3. The Stack v2 is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack v2 must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
4. The Stack v2 is regularly updated to enact validated data removal requests. By clicking on "Access repository", you agree to update your own version of The Stack v2 to the most recent usable version.
By clicking on "Access repository" below, you accept that your contact information (email address and username) can be shared with the dataset maintainers as well.
extra_gated_fields:
Email: text
I have read the License and agree with its terms: checkbox
dataset_info:
features:
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: directory_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_language
dtype: string
- name: files
list:
- name: blob_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: language
dtype: string
- name: length_bytes
dtype: int64
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: src_encoding
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: alphanum_fraction
dtype: float32
- name: alpha_fraction
dtype: float32
- name: num_lines
dtype: int32
- name: avg_line_length
dtype: float32
- name: max_line_length
dtype: int32
- name: num_files
dtype: int64
splits:
- name: train
num_bytes: 112773164389
num_examples: 48348592
download_size: 72680443362
dataset_size: 112773164389
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# The Stack v2
<center>
<img src="https://huggingface.co/datasets/bigcode/admin_private/resolve/main/thestackv2_banner.png" alt="Stackv2" width="900" height="600">
</center>
## Dataset Description
- **Homepage:** https://www.bigcode-project.org/
- **Repository:** https://github.com/bigcode-project
- **Paper:** [Link](https://huggingface.co/papers/2402.19173)
- **Point of Contact:** contact@bigcode-project.org
The dataset consists of 4 versions:
- [`bigcode/the-stack-v2`](https://huggingface.co/datasets/bigcode/the-stack-v2): the full "The Stack v2" dataset
- [`bigcode/the-stack-v2-dedup`](https://huggingface.co/datasets/bigcode/the-stack-v2-dedup): based on the `bigcode/the-stack-v2` but further near-deduplicated
- [`bigcode/the-stack-v2-train-full-ids`](https://huggingface.co/datasets/bigcode/the-stack-v2-train-full-ids): based on the `bigcode/the-stack-v2-dedup` dataset but further filtered with heuristics and spanning 600+ programming languages. The data is grouped into repositories.
- [`bigcode/the-stack-v2-train-smol-ids`](https://huggingface.co/datasets/bigcode/the-stack-v2-train-smol-ids): based on the `bigcode/the-stack-v2-dedup` dataset but further filtered with heuristics and spanning 17 programming languages. The data is grouped into repositories. **<-- you are here**
**These datasets only contain the SWHIDs needed to download the code files, not the contents of the files themselves. See the examples below for how to download the contents. We are working on making the training datasets available in the coming weeks.**
The Stack v2 is significantly larger than v1:
||The Stack v1|The Stack v2|
|-|-|-|
| full | 6.4TB | 67.5TB |
| dedup | 2.9TB | 32.1TB |
| train (full) | ~200B tokens | ~900B tokens |
### Changelog
|Release|Description|
|-|-|
| v2.0.1 | Version bump without modifications to the dataset. StarCoder2 was trained on this version |
| v2.0 | Initial release of the Stack v2 |
### Dataset Summary
The Stack v2 contains over 3B files in 600+ programming and markup languages. The dataset was created as part of the [BigCode Project](https://www.bigcode-project.org/), an open scientific collaboration working on the responsible development of Large Language Models for Code (Code LLMs). The Stack serves as a pre-training dataset for Code LLMs, i.e., code-generating AI systems which enable the synthesis of programs from natural language descriptions as well as from other code snippets.
This dataset is derived from the Software Heritage archive, the largest public archive of software source code and accompanying development history. Software Heritage is an open, non profit initiative to collect, preserve, and share the source code of all publicly available software, launched by Inria, in partnership with UNESCO. We acknowledge Software Heritage for providing access to this invaluable resource. For more details, visit the [Software Heritage website](https://www.softwareheritage.org).
### Languages
The `smol` dataset contains 39 languages.
```
Ant Build System, AsciiDoc, C, C#, C++, CMake, Dockerfile, Go, Go Module, Gradle, Groovy, HTML, INI, Java, Java Properties, JavaScript, JSON, JSON with Comments, Kotlin, Lua, M4Sugar, Makefile, Markdown, Maven POM, PHP, Python, R, RDoc, reStructuredText, RMarkdown, Ruby, Rust, Shell, SQL, Swift, Text, TOML, TypeScript, YAML
```
### How to use it
```python
from datasets import load_dataset
# full dataset (file IDs only)
ds = load_dataset("bigcode/the-stack-v2-train-smol-ids", split="train")
# dataset streaming (will only download the data as needed)
ds = load_dataset("bigcode/the-stack-v2-train-smol-ids", streaming=True, split="train")
for sample in iter(ds):
print(sample)
```
#### Downloading the file contents
The file contents are stored in the Software Heritage S3 bucket to ensure data compliance. Downloading data in bulk requires an agreement with SoftwareHeritage and INRIA as stated in the dataset agreement.
Make sure to configure your environment with your [AWS credentials](https://awscli.amazonaws.com/v2/documentation/api/latest/reference/configure/index.html#examples).
```bash
pip install smart_open[s3]
```
```python
import os
import boto3
from smart_open import open
from datasets import load_dataset
session = boto3.Session(
aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"])
s3 = session.client("s3")
def download_contents(files):
for file in files:
s3_url = f"s3://softwareheritage/content/{file['blob_id']}"
with open(s3_url, "rb", compression=".gz", transport_params={"client": s3}) as fin:
file["content"] = fin.read().decode(file["src_encoding"])
return {"files": files}
ds = load_dataset("bigcode/the-stack-v2-train-smol-ids", split="train", streaming=True)
ds = ds.map(lambda row: download_contents(row["files"]))
for row in ds:
for file in row["files"]:
print(file["content"])
break
```
## Dataset Structure
### Data Fields
* `blob_id` (`string`): Software Heritage (SWH) ID of the file on AWS S3.
* `directory_id` (`string`): SWH ID of the root directory of the repository.
* `path` (`string`): The file path within the repository.
* `content_id` (`string`): SWH content ID.
* `detected_licenses` (`string[]`): List of licenses (SPDX) detected by ScanCode.
* `license_type` (`string`): Inferred license type (`permissive` or `no_license`).
* `repo_name` (`string`): Repository name on GitHub.
* `snapshot_id` (`string`): SWH snapshot ID.
* `revision_id` (`string`): SWH revision (commit) ID.
* `branch_name` (`string`): Repository branch name.
* `visit_date` (`timestamp[ns]`): SWH crawl (snapshot) timestamp.
* `revision_date` (`timestamp[ns]`): SWH revision (commit) timestamp.
* `committer_date` (`timestamp[ns]`): SWH revision (commit) timestamp reported by the committer.
* `github_id` (`int64`): GitHub identifier for the repository.
* `star_events_count` (`int64`): number of stars calculated from GHArchive events.
* `fork_events_count` (`int64`): number of forks calculated from GHArchive events.
* `gha_license_id` (`string`): GHArchive SPDX license identifier, `None` if the repo is missing.
* `gha_event_created_at` (`timestamp[ns]`): Timestamp of the latest event on GHArchive for this repository.
* `gha_created_at` (`timestamp[ns]`): Timestamp of repository creation on GitHub, `None` if the repo is missing.
* `gha_language` (`string`): Repository's primary programming language on GitHub, `None` if the repo is missing.
* `src_encoding` (`string`): Original encoding of the file content before converting to UTF-8.
* `language` (`string`): Programming language of the file, detected by `go-enry / linguist`.
* `is_vendor` (`bool`): Indicator of vendor file (external library), detected by `go-enry`.
* `is_generated` (`bool`): Indicator of generated file (external library), detected by `go-enry`.
* `length_bytes` (`int64`): Length of the file content in UTF-8 bytes.
* `extension` (`string`): File extension.
### Data Splits
The dataset has no predefined splits; all data is loaded as the train split by default. If you want to set up a custom train-test split, beware that the dataset contains many near-duplicates, which can cause leakage into the test split.
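One way to reduce that leakage is to split at the repository level rather than the file level, so that files from the same project never straddle the boundary. A minimal sketch, assuming you split on the `repo_name` field from the schema above (the 5% hold-out fraction is an arbitrary choice):

```python
import hashlib

def repo_split(repo_name: str, test_fraction: float = 0.05) -> str:
    """Deterministically assign a repository to 'train' or 'test'.

    Hashing the repository name keeps the assignment stable across runs
    and machines, and keeps all files of one repository on the same side.
    """
    digest = hashlib.sha256(repo_name.encode("utf-8")).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64  # uniform in [0, 1)
    return "test" if bucket < test_fraction else "train"

# All rows of the same repository land in the same split:
assert repo_split("octocat/hello-world") == repo_split("octocat/hello-world")
```

Note that forks still duplicate content across differently named repositories, so a hash on `repo_name` mitigates, but does not eliminate, near-duplicate leakage.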
## Dataset Creation
For more information on the dataset creation pipeline please refer to the [technical report](https://huggingface.co/papers/2402.19173).
### Curation Rationale
One of the challenges faced by researchers working on code LLMs is the lack of openness and transparency around the development of these systems. Most prior works described the high-level data collection process but did not release the training data. It is therefore difficult for other researchers to fully reproduce these models and understand what kind of pre-training data leads to high-performing code LLMs. By releasing an open large-scale code dataset we hope to make training of code LLMs more reproducible.
### Source Data
#### Data Collection
3.28B unique files belonging to 104.2M github repositories were collected by traversing the Software Heritage [2023-09-06](https://docs.softwareheritage.org/devel/swh-dataset/graph/dataset.html#graph-dataset-2023-09-06) graph dataset.
Additional repository-level metadata was collected from [GitHub Archive](https://www.gharchive.org/) data up to 2023-09-14.
The total uncompressed size of all files is 67.53TB.
Near-deduplication was implemented in the pre-processing pipeline on top of exact deduplication.
Roughly 40% of permissively licensed files were (near-)duplicates.
The following are not stored:
* Files that cannot contribute to training code: binary, empty, could not be decoded
* Files larger than 10MB
**Training Datasets**: For the training datasets, the programming languages were filtered further, down to 17 for the `the-stack-v2-train-smol-ids` dataset and 600+ for the `the-stack-v2-train-full-ids` dataset. In addition, heuristics were applied to further increase the quality of the dataset. The code files are also grouped into repositories to allow pretraining with full repository context. For more details see the [technical report](https://huggingface.co/papers/2402.19173).
##### License detection
We extract repository-level license information from [GH Archive](https://www.gharchive.org/) for all repositories with matching names in the SWH dataset.
When the repo-level license is not available, i.e., for 96.93% of repositories, we use the [ScanCode Toolkit](https://github.com/nexB/scancode-toolkit) to detect file-level licenses as follows:
* Find all filenames that could contain a license (e.g., LICENSE, MIT.txt, Apache2.0) or contain a reference to the license (e.g., README.md, GUIDELINES);
* Apply ScanCode's license detection to the matching files and gather the SPDX IDs of the detected licenses;
* Propagate the detected licenses to all files that have the same base path within the repository as the license file.
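The propagation step can be sketched as follows. This is a simplified illustration rather than the actual pipeline code, and the example paths and SPDX IDs are made up:

```python
import posixpath

def propagate_licenses(license_files, all_paths):
    """Propagate file-level license detections to sibling files.

    Every file in the same directory as a detected license file (or in a
    subdirectory of it) inherits that license file's SPDX IDs.
    """
    detected = {}
    for path in all_paths:
        ids = []
        for lic_path, spdx_ids in license_files.items():
            base = posixpath.dirname(lic_path)  # "" for a root-level license
            if (base == ""
                    or posixpath.dirname(path) == base
                    or path.startswith(base + "/")):
                ids.extend(spdx_ids)
        detected[path] = sorted(set(ids))
    return detected

licenses = {"LICENSE": ["MIT"], "vendor/lib/LICENSE.txt": ["Apache-2.0"]}
files = ["src/main.py", "vendor/lib/util.c", "README.md"]
print(propagate_licenses(licenses, files))
```

A root-level license applies to the whole repository here, while a nested license additionally covers its own subtree, which is why `vendor/lib/util.c` ends up with both SPDX IDs.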
The licenses we consider permissive are listed [here](https://huggingface.co/datasets/bigcode/the-stack-v2/blob/main/license_stats.csv).
This list was compiled from the licenses approved by the [Blue Oak Council](https://blueoakcouncil.org/list),
as well as licenses categorized as "Permissive" or "Public Domain" by [ScanCode](https://scancode-licensedb.aboutcode.org/).
#### Who are the source language producers?
The source (code) language producers are users of GitHub that created unique repository names up until 2023-09-06 (cutoff date).
### Personal and Sensitive Information
The released dataset may contain sensitive information such as emails, IP addresses, and API/ssh keys that have previously been published to public repositories on GitHub. Deduplication has helped to reduce the amount of sensitive data that may exist. In the event that the dataset contains personal information, researchers should only use public, non-personal information in support of conducting and publishing their [open-access](https://en.wikipedia.org/wiki/Open_access) research. Personal information should not be used for spamming purposes, including sending unsolicited emails or selling of personal information. Complaints, removal requests, and "do not contact" requests can be sent to contact@bigcode-project.org.
### Opting out of The Stack v2
We are giving developers the ability to have their code removed from the dataset upon request. The process for submitting and enacting removal requests will keep evolving throughout the project as we receive feedback and build up more data governance tools.
You can check if your code is in The Stack v2 with the following ["Am I In The Stack?" Space](https://huggingface.co/spaces/bigcode/in-the-stack). If you'd like to have your data removed from the dataset follow the [instructions on GitHub](https://github.com/bigcode-project/opt-out-v2).
## Considerations for Using the Data
### Social Impact of Dataset
The Stack v2 is an output of the BigCode Project. BigCode aims to be responsible by design and by default. The project is conducted in the spirit of Open Science, focused on the responsible development of LLMs for code.
With the release of The Stack v2, we aim to increase access, reproducibility, and transparency of code LLMs in the research community. Work to de-risk and improve on the implementation of ethical best practices of code LLMs is conducted in various BigCode working groups. The Legal, Ethics, and Governance working group has explored topics such as licensing (including copyleft and the intended use of permissively licensed code), attribution of generated code to original code, rights to restrict processing, the inclusion of Personally Identifiable Information (PII), and risks of malicious code, among other topics. This work is ongoing as of October 25th, 2022.
We expect code LLMs to enable people from diverse backgrounds to write higher quality code and develop low-code applications. Mission-critical software could become easier to maintain as professional developers are guided by code-generating systems on how to write more robust and efficient code. While the social impact is intended to be positive, the increased accessibility of code LLMs comes with certain risks such as over-reliance on the generated code and long-term effects on the software development job market.
A broader impact analysis relating to Code LLMs can be found in section 7 of this [paper](https://arxiv.org/abs/2107.03374). An in-depth risk assessment for Code LLMs can be found in section 4 of this [paper](https://arxiv.org/abs/2207.14157).
### Discussion of Biases
The code collected from GitHub does not contain demographic information or proxy information about the demographics. However, it is not without risks,
as the comments within the code may contain harmful or offensive language, which could be learned by the models.
Widely adopted programming languages like C and JavaScript are overrepresented compared to niche programming languages like Julia and Scala. Some programming languages, such as SQL, Batchfile, and TypeScript, are less likely to be permissively licensed (4% vs. the average 10%). This may result in a biased representation of those languages. Permissively licensed files also tend to be longer.
The majority of natural language present in code from GitHub is English.
### Other Known Limitations
One of the current limitations of The Stack v2 is that scraped HTML for websites may not be compliant with Web Content Accessibility Guidelines ([WCAG](https://www.w3.org/WAI/standards-guidelines/wcag/)). This could have an impact on HTML-generated code that may introduce web accessibility issues.
The training dataset could contain malicious code and/or the model could be used to generate malware or ransomware.
To the best of our knowledge, all files contained in the dataset are licensed with one of the permissive licenses (see list in [Licensing information](#licensing-information)) or no license.
The accuracy of license attribution is limited by the accuracy of GHArchive and ScanCode Toolkit.
Any mistakes should be reported to BigCode Project for review and follow-up as needed.
## Additional Information
### Dataset Curators
1. Harm de Vries, ServiceNow Research, harm.devries@servicenow.com
2. Leandro von Werra, Hugging Face, leandro@huggingface.co
### Licensing Information
The Stack v2 is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack v2 must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
The list of [SPDX license identifiers](https://spdx.org/licenses/) included in the dataset can be found [here](https://huggingface.co/datasets/bigcode/the-stack-v2/blob/main/license_stats.csv).
### Citation Information
```bibtex
@misc{lozhkov2024starcoder,
title={StarCoder 2 and The Stack v2: The Next Generation},
author={Anton Lozhkov and Raymond Li and Loubna Ben Allal and Federico Cassano and Joel Lamy-Poirier and Nouamane Tazi and Ao Tang and Dmytro Pykhtar and Jiawei Liu and Yuxiang Wei and Tianyang Liu and Max Tian and Denis Kocetkov and Arthur Zucker and Younes Belkada and Zijian Wang and Qian Liu and Dmitry Abulkhanov and Indraneil Paul and Zhuang Li and Wen-Ding Li and Megan Risdal and Jia Li and Jian Zhu and Terry Yue Zhuo and Evgenii Zheltonozhskii and Nii Osae Osae Dade and Wenhao Yu and Lucas Krauß and Naman Jain and Yixuan Su and Xuanli He and Manan Dey and Edoardo Abati and Yekun Chai and Niklas Muennighoff and Xiangru Tang and Muhtasham Oblokulov and Christopher Akiki and Marc Marone and Chenghao Mou and Mayank Mishra and Alex Gu and Binyuan Hui and Tri Dao and Armel Zebaze and Olivier Dehaene and Nicolas Patry and Canwen Xu and Julian McAuley and Han Hu and Torsten Scholak and Sebastien Paquet and Jennifer Robinson and Carolyn Jane Anderson and Nicolas Chapados and Mostofa Patwary and Nima Tajbakhsh and Yacine Jernite and Carlos Muñoz Ferrandis and Lingming Zhang and Sean Hughes and Thomas Wolf and Arjun Guha and Leandro von Werra and Harm de Vries},
year={2024},
eprint={2402.19173},
archivePrefix={arXiv},
primaryClass={cs.SE}
}
``` |
CyberHarem/ganaha_hibiki_theidolmster | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ganaha_hibiki/我那覇響 (THE iDOLM@STER)
This is the dataset of ganaha_hibiki/我那覇響 (THE iDOLM@STER), containing 500 images and their tags.
The core tags of this character are `long_hair, black_hair, ponytail, blue_eyes, fang, earrings, antenna_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 507.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ganaha_hibiki_theidolmster/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 345.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ganaha_hibiki_theidolmster/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1167 | 695.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ganaha_hibiki_theidolmster/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 469.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ganaha_hibiki_theidolmster/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1167 | 893.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ganaha_hibiki_theidolmster/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ganaha_hibiki_theidolmster',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
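Beyond printing, the per-item tags can be aggregated, for example to see which tags dominate the dataset. A minimal sketch, assuming each `item.meta['tags']` maps tag names to scores as in the loop above (the sample dicts here are illustrative, not taken from the dataset):

```python
from collections import Counter

def top_tags(items, n=10):
    # Tally how often each tag appears across item meta dicts;
    # 'tags' is assumed to map tag name -> score.
    counts = Counter()
    for meta in items:
        counts.update(meta['tags'].keys())
    return counts.most_common(n)

# Illustrative meta dicts; real ones come from the LocalSource items.
sample = [
    {'tags': {'1girl': 0.99, 'solo': 0.95, 'smile': 0.7}},
    {'tags': {'1girl': 0.98, 'open_mouth': 0.8}},
]
print(top_tags(sample, 3))  # [('1girl', 2), ('solo', 1), ('smile', 1)]
```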
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, animal, hamster, shorts, solo, open_mouth, sandals, bracelet, :d, hoop_earrings |
| 1 | 19 |  |  |  |  |  | 1girl, solo, open_mouth, hoop_earrings, smile, blush, bracelet, hair_ribbon |
| 2 | 16 |  |  |  |  |  | 1girl, open_mouth, solo, navel, bracelet, midriff, necklace, shorts, :d, belt |
| 3 | 8 |  |  |  |  |  | 1girl, dress, smile, solo, elbow_gloves, jewelry, bare_shoulders, open_mouth, ribbon |
| 4 | 10 |  |  |  |  |  | 1girl, smile, solo, cleavage, striped_bikini, high_ponytail, medium_breasts, open_mouth, hoop_earrings, looking_at_viewer, navel, water, barefoot, one_eye_closed |
| 5 | 9 |  |  |  |  |  | 1girl, hair_flower, smile, solo, kimono, open_mouth, new_year |
| 6 | 7 |  |  |  |  |  | 1girl, apron, open_mouth, maid_headdress, solo, blush, enmaided, smile, white_thighhighs |
| 7 | 6 |  |  |  |  |  | 1girl, bangs, blush, looking_at_viewer, solo, white_background, hair_between_eyes, hair_ribbon, short_shorts, simple_background, very_long_hair, collarbone, open_mouth, short_sleeves, :d, cleavage, medium_breasts, necklace, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | animal | hamster | shorts | solo | open_mouth | sandals | bracelet | :d | hoop_earrings | smile | blush | hair_ribbon | navel | midriff | necklace | belt | dress | elbow_gloves | jewelry | bare_shoulders | ribbon | cleavage | striped_bikini | high_ponytail | medium_breasts | looking_at_viewer | water | barefoot | one_eye_closed | hair_flower | kimono | new_year | apron | maid_headdress | enmaided | white_thighhighs | bangs | white_background | hair_between_eyes | short_shorts | simple_background | very_long_hair | collarbone | short_sleeves | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:----------|:---------|:-------|:-------------|:----------|:-----------|:-----|:----------------|:--------|:--------|:--------------|:--------|:----------|:-----------|:-------|:--------|:---------------|:----------|:-----------------|:---------|:-----------|:-----------------|:----------------|:-----------------|:--------------------|:--------|:-----------|:-----------------|:--------------|:---------|:-----------|:--------|:-----------------|:-----------|:-------------------|:--------|:-------------------|:--------------------|:---------------|:--------------------|:-----------------|:-------------|:----------------|:--------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | | | | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | | | X | X | X | | X | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | | X | X | | | | | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | | | X | X | | | | X | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | | X | X | | | X | | | X | X | | | X | | | | | | | X | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
irds/clinicaltrials_2021_trec-ct-2022 | ---
pretty_name: '`clinicaltrials/2021/trec-ct-2022`'
viewer: false
source_datasets: ['irds/clinicaltrials_2021']
task_categories:
- text-retrieval
---
# Dataset Card for `clinicaltrials/2021/trec-ct-2022`
The `clinicaltrials/2021/trec-ct-2022` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clinicaltrials#clinicaltrials/2021/trec-ct-2022).
# Data
This dataset provides:
- `queries` (i.e., topics); count=50
- For `docs`, use [`irds/clinicaltrials_2021`](https://huggingface.co/datasets/irds/clinicaltrials_2021)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/clinicaltrials_2021_trec-ct-2022', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
ekolasky/SciREXForCustomLEDConsol | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: result_labels
sequence: int64
- name: grouping_vector
sequence:
sequence: int64
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 22188308
num_examples: 260
- name: validation
num_bytes: 3629858
num_examples: 44
download_size: 4000584
dataset_size: 25818166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
sheik21/audiomatheuz | ---
license: openrail
---
|
mxronga/nvidia_steer_yo | ---
license: apache-2.0
language:
- yo
tags:
- pretrain
---
Yoruba translation of the Nvidia steer dataset |
AIGym/ai-tech-articles | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 177472659
num_examples: 17092
download_size: 80029866
dataset_size: 177472659
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DBQ/Blickers.Product.prices.France | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: France - Blickers - Product-level price list
tags:
- webscraping
- ecommerce
- Blickers
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 2820149
num_examples: 7489
download_size: 1352484
dataset_size: 2820149
---
# Blickers web scraped data
## About the website
Blickers operates in the **Ecommerce industry** of the Europe, Middle East, and Africa (EMEA) region, with a specific focus on **France**. This industry encompasses commercial transactions conducted electronically on the internet, including buying, selling, and exchanging goods or services. In France especially, the **Ecommerce sector** is booming, driven by a strong shift in consumer behavior towards online shopping and digital transactions. The dataset contains **Ecommerce product-list page (PLP) data on Blickers in France**, offering a comprehensive overview of customer activities, preferences, and the performance of various products, and can guide strategic decision-making to optimize conversions.
## Link to **dataset**
[France - Blickers - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Blickers%20Product-prices%20France/r/recrjX2FST51AHd7c)
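As a quick sanity check on the price columns, the share of discounted products can be recomputed from `price` and `full_price`. A minimal sketch; treating `flg_discount` as "price below full price" is an assumption, and the sample rows are illustrative:

```python
def discount_share(rows):
    # Fraction of rows whose selling price is below the full price
    # (assumed to mirror the flg_discount column).
    flagged = [r for r in rows if r['price'] < r['full_price']]
    return len(flagged) / len(rows) if rows else 0.0

sample = [
    {'full_price': 100.0, 'price': 80.0},  # discounted
    {'full_price': 50.0, 'price': 50.0},   # full price
]
print(discount_share(sample))  # 0.5
```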
|
financeart/EmiTalks2 | ---
license: mit
---
|
heliosprime/twitter_dataset_1713049863 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11857
num_examples: 26
download_size: 9107
dataset_size: 11857
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713049863"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Millena123/Rose | ---
license: openrail
---
|
gaygaaa/KEYWORDS | ---
license: mit
---
|
Renanriozz/Renanzzz | ---
license: afl-3.0
---
|
CyberHarem/magdeburg_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of magdeburg/マクデブルク/马格德堡 (Azur Lane)
This is the dataset of magdeburg/マクデブルク/马格德堡 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `black_hair, horns, long_hair, breasts, multicolored_hair, red_eyes, bangs, hair_between_eyes, red_hair, large_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 20.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 12.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 35 | 25.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 18.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 35 | 35.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/magdeburg_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, solo, navel, open_mouth, smile, looking_at_viewer, black_bikini, blush, nail_polish, thighhighs, cleavage, cloud, o-ring_bikini, outdoors, see-through, sky, tied_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | navel | open_mouth | smile | looking_at_viewer | black_bikini | blush | nail_polish | thighhighs | cleavage | cloud | o-ring_bikini | outdoors | see-through | sky | tied_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:--------|:--------------------|:---------------|:--------|:--------------|:-------------|:-----------|:--------|:----------------|:-----------|:--------------|:------|:-------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Praghxx/Tetin | ---
license: openrail
---
|
izzy-lazerson/audio-test-metadata | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: file_info
dtype: string
splits:
- name: train
num_bytes: 9172805.0
num_examples: 40
download_size: 8703874
dataset_size: 9172805.0
---
# Dataset Card for "audio-test-metadata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rosenpp/asterdata | ---
license: mit
language:
- bg
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 0
num_examples: 0
download_size: 0
dataset_size: 0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Seetha/visual_cs | ---
size_categories:
- n<1K
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: string
- name: Non-performance
dtype: int64
- name: Investors
dtype: int64
- name: Customers
dtype: int64
- name: Employees
dtype: int64
- name: Society
dtype: int64
splits:
- name: train
num_bytes: 269
num_examples: 5
download_size: 3579
dataset_size: 269
---
|
davidkim205/ko_common_gen | ---
language:
- ko
---
# News Common Gen
## Introduction
A common gen dataset built from news data. 4,639 examples in total.
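Each line of the dataset is one JSON record holding a concept set, four candidate endings, and the index of the correct ending in `label` (see the structure below). A minimal parsing sketch; the sample record here is illustrative:

```python
import json

def parse_records(lines):
    # One JSON record per non-empty line (.jsonl style).
    return [json.loads(line) for line in lines if line.strip()]

def correct_ending(record):
    # Pick the ending selected by the record's label index.
    return record[f"ending{record['label']}"]

sample = '{"concept_set": "concept set: {a, b}", "ending0": "x", "ending1": "y", "label": 1}'
records = parse_records([sample])
print(correct_ending(records[0]))  # y
```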
## Structure
```jsonl
{
"concept_set": "concept set: {ν΄μμ΄, μ€λ§νΈμμΉ, μ΄λλ€λ, λ§μ΄ν¬, LED μμ₯ μ±μ₯}",
"ending0": "λ§μ΄ν¬λ‘ μ€λ§νΈμμΉκ° LED μμ₯ μ±μ₯μ μ΄λλ€λ ν΄μμ΄λ€.",
"ending1": "μ€λ§νΈμμΉκ° λ§μ΄ν¬λ‘ LED μμ₯ μ±μ₯μ μ΄λλ€λ ν΄μμ΄λ€.",
"ending2": "λ§μ΄ν¬λ‘ μ΄λλ€λ LED μμ₯ μ±μ₯μ μ€λ§νΈμμΉκ° ν΄μμ΄λ€.",
"ending3": "μ€λ§νΈμμΉκ° LED μμ₯ μ±μ₯μ λ§μ΄ν¬λ‘ μ΄λλ€λ ν΄μμ΄λ€.",
"label": 1
}
{...}
``` |
joey234/mmlu-philosophy-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 28918
num_examples: 44
download_size: 22058
dataset_size: 28918
---
# Dataset Card for "mmlu-philosophy-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic | ---
pretty_name: Evaluation run of Voicelab/trurl-2-13b-academic
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Voicelab/trurl-2-13b-academic](https://huggingface.co/Voicelab/trurl-2-13b-academic)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T13:54:25.329738](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic/blob/main/results_2023-10-26T13-54-25.329738.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.38265520134228187,\n\
\ \"em_stderr\": 0.004977455184961271,\n \"f1\": 0.45275587248322363,\n\
\ \"f1_stderr\": 0.004784339979418239,\n \"acc\": 0.4373808097665532,\n\
\ \"acc_stderr\": 0.010248109703374565\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.38265520134228187,\n \"em_stderr\": 0.004977455184961271,\n\
\ \"f1\": 0.45275587248322363,\n \"f1_stderr\": 0.004784339979418239\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10917361637604246,\n \
\ \"acc_stderr\": 0.008590089300511146\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Voicelab/trurl-2-13b-academic
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|arc:challenge|25_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T13_54_25.329738
path:
- '**/details_harness|drop|3_2023-10-26T13-54-25.329738.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T13-54-25.329738.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T13_54_25.329738
path:
- '**/details_harness|gsm8k|5_2023-10-26T13-54-25.329738.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T13-54-25.329738.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hellaswag|10_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T13_54_25.329738
path:
- '**/details_harness|winogrande|5_2023-10-26T13-54-25.329738.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T13-54-25.329738.parquet'
- config_name: results
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- results_2023-09-21T21-26-52.608718.parquet
- split: 2023_10_26T13_54_25.329738
path:
- results_2023-10-26T13-54-25.329738.parquet
- split: latest
path:
- results_2023-10-26T13-54-25.329738.parquet
---
# Dataset Card for Evaluation run of Voicelab/trurl-2-13b-academic
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Voicelab/trurl-2-13b-academic
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-13b-academic](https://huggingface.co/Voicelab/trurl-2-13b-academic) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic",
"harness_winogrande_5",
split="train")
```
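The timestamped split names above follow directly from the run timestamps that appear in the result file names. A minimal sketch of the mapping (an inference from the names in this card, not an official helper function):

```python
# The split name appears to be the run timestamp (as it occurs in the
# parquet/json file names) with dashes replaced by underscores.
# This helper is illustrative only -- inferred from the names above.
def timestamp_to_split_name(timestamp: str) -> str:
    """Convert a run timestamp like '2023-10-26T13-54-25.329738'
    into the corresponding split name '2023_10_26T13_54_25.329738'."""
    return timestamp.replace("-", "_")

print(timestamp_to_split_name("2023-10-26T13-54-25.329738"))
# 2023_10_26T13_54_25.329738
```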
## Latest results
These are the [latest results from run 2023-10-26T13:54:25.329738](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic/blob/main/results_2023-10-26T13-54-25.329738.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"em": 0.38265520134228187,
"em_stderr": 0.004977455184961271,
"f1": 0.45275587248322363,
"f1_stderr": 0.004784339979418239,
"acc": 0.4373808097665532,
"acc_stderr": 0.010248109703374565
},
"harness|drop|3": {
"em": 0.38265520134228187,
"em_stderr": 0.004977455184961271,
"f1": 0.45275587248322363,
"f1_stderr": 0.004784339979418239
},
"harness|gsm8k|5": {
"acc": 0.10917361637604246,
"acc_stderr": 0.008590089300511146
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
}
}
```
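As a quick sanity check on the numbers above (an observation about this particular run, not documented aggregation behaviour), the top-level `acc` appears to be the unweighted mean of the per-task accuracies for gsm8k and winogrande:

```python
# Per-task accuracies copied from the latest results above.
gsm8k_acc = 0.10917361637604246
winogrande_acc = 0.7655880031570639

# Aggregated "all" accuracy reported in the same results block.
all_acc = 0.4373808097665532

# The aggregate matches the simple mean of the two task accuracies.
mean_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(mean_acc - all_acc) < 1e-12
```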
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/semeval-task-8-a-mono-gltr-ppl | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: model
dtype: string
- name: source
dtype: string
- name: id
dtype: int64
- name: gltr
sequence: int64
- name: ppl
sequence: float64
splits:
- name: train
num_bytes: 245302117
num_examples: 83829
- name: val
num_bytes: 105434420
num_examples: 35928
- name: test
num_bytes: 11023757
num_examples: 5000
download_size: 209455821
dataset_size: 361760294
---
# Dataset Card for "semeval-task-8-a-mono-gltr-ppl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |