| datasetId | card |
|---|---|
3mrys/uni_co | ---
license: apache-2.0
---
|
distilled-from-one-sec-cv12/chunk_56 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1395673060
num_examples: 271955
download_size: 1426718157
dataset_size: 1395673060
---
# Dataset Card for "chunk_56"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
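The size fields in a card like chunk_56's pin down the per-example cost; as a quick sanity check using only the numbers stated above (plain Python, no download needed):

```python
# Figures copied from the chunk_56 card above.
num_bytes = 1395673060      # decompressed dataset_size
num_examples = 271955
download_size = 1426718157

# Average serialized size of one example (logits vector + MFCC matrix).
avg_example_bytes = num_bytes / num_examples

# For this float-heavy data the download is slightly *larger* than the
# in-memory size, i.e. the parquet files barely compress.
ratio = download_size / num_bytes

print(f"{avg_example_bytes:.0f} bytes/example, download/data ratio {ratio:.3f}")
# → 5132 bytes/example, download/data ratio 1.022
```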
AlFrauch/arxive_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 100081932758
num_examples: 1903579
download_size: 16157271684
dataset_size: 100081932758
---
# Dataset Card for "arxive_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
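By contrast, the arxive_dataset card shows how strongly plain text compresses; comparing its download_size against its dataset_size (numbers taken verbatim from the card above):

```python
# Figures copied from the arxive_dataset card above.
dataset_size = 100081932758   # ~100 GB of decompressed text
download_size = 16157271684   # ~16 GB of hosted parquet files

# Plain text compresses far better than dense float arrays.
print(f"compression factor: {dataset_size / download_size:.1f}x")
# → compression factor: 6.2x
```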
open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3 | ---
pretty_name: Evaluation run of cloudyu/60B_MoE_Coder_v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cloudyu/60B_MoE_Coder_v3](https://huggingface.co/cloudyu/60B_MoE_Coder_v3) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T04:01:02.016455](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3/blob/main/results_2024-02-10T04-01-02.016455.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7507692268285616,\n\
\ \"acc_stderr\": 0.028851716073379694,\n \"acc_norm\": 0.7546718582183198,\n\
\ \"acc_norm_stderr\": 0.029402819641764666,\n \"mc1\": 0.5042839657282742,\n\
\ \"mc1_stderr\": 0.017502858577371258,\n \"mc2\": 0.6700593362662586,\n\
\ \"mc2_stderr\": 0.014408380056133315\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428175\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.658832901812388,\n\
\ \"acc_stderr\": 0.004731324409133276,\n \"acc_norm\": 0.8544114718183629,\n\
\ \"acc_norm_stderr\": 0.003519724163310883\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.02629399585547494,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.02629399585547494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.02628055093284806,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.02628055093284806\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999998,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999998\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n\
\ \"acc_stderr\": 0.03368762932259433,\n \"acc_norm\": 0.7341040462427746,\n\
\ \"acc_norm_stderr\": 0.03368762932259433\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7531914893617021,\n \"acc_stderr\": 0.028185441301234095,\n\
\ \"acc_norm\": 0.7531914893617021,\n \"acc_norm_stderr\": 0.028185441301234095\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7248677248677249,\n \"acc_stderr\": 0.023000086859068652,\n \"\
acc_norm\": 0.7248677248677249,\n \"acc_norm_stderr\": 0.023000086859068652\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9,\n \"acc_stderr\": 0.017066403719657255,\n \"acc_norm\": 0.9,\n\
\ \"acc_norm_stderr\": 0.017066403719657255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199488,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476434,\n\
\ \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476434\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.0198801654065888,\n \
\ \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.0198801654065888\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45925925925925926,\n \"acc_stderr\": 0.030384169232350825,\n \
\ \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.030384169232350825\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8403361344537815,\n \"acc_stderr\": 0.023793353997528802,\n\
\ \"acc_norm\": 0.8403361344537815,\n \"acc_norm_stderr\": 0.023793353997528802\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281732,\n \"\
acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230439,\n \"\
acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230439\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.01886951464665893,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.01886951464665893\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375853,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375853\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002159,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002159\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253858,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253858\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n\
\ \"acc_stderr\": 0.010648356301876345,\n \"acc_norm\": 0.9016602809706258,\n\
\ \"acc_norm_stderr\": 0.010648356301876345\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.02038322955113501,\n\
\ \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.02038322955113501\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7687150837988826,\n\
\ \"acc_stderr\": 0.01410222362315259,\n \"acc_norm\": 0.7687150837988826,\n\
\ \"acc_norm_stderr\": 0.01410222362315259\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043714,\n\
\ \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043714\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8549382716049383,\n \"acc_stderr\": 0.019594877019727952,\n\
\ \"acc_norm\": 0.8549382716049383,\n \"acc_norm_stderr\": 0.019594877019727952\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6170212765957447,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5782268578878749,\n\
\ \"acc_stderr\": 0.012612974369390984,\n \"acc_norm\": 0.5782268578878749,\n\
\ \"acc_norm_stderr\": 0.012612974369390984\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7977941176470589,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.7977941176470589,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.803921568627451,\n \"acc_stderr\": 0.01606205642196863,\n \
\ \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.01606205642196863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n\
\ \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824636,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.5903614457831325,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5042839657282742,\n\
\ \"mc1_stderr\": 0.017502858577371258,\n \"mc2\": 0.6700593362662586,\n\
\ \"mc2_stderr\": 0.014408380056133315\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498428\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6694465504169825,\n \
\ \"acc_stderr\": 0.012957496367085026\n }\n}\n```"
repo_url: https://huggingface.co/cloudyu/60B_MoE_Coder_v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|arc:challenge|25_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|gsm8k|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hellaswag|10_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T04-01-02.016455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T04-01-02.016455.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- '**/details_harness|winogrande|5_2024-02-10T04-01-02.016455.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T04-01-02.016455.parquet'
- config_name: results
data_files:
- split: 2024_02_10T04_01_02.016455
path:
- results_2024-02-10T04-01-02.016455.parquet
- split: latest
path:
- results_2024-02-10T04-01-02.016455.parquet
---
# Dataset Card for Evaluation run of cloudyu/60B_MoE_Coder_v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/60B_MoE_Coder_v3](https://huggingface.co/cloudyu/60B_MoE_Coder_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-10T04:01:02.016455](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3/blob/main/results_2024-02-10T04-01-02.016455.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7507692268285616,
"acc_stderr": 0.028851716073379694,
"acc_norm": 0.7546718582183198,
"acc_norm_stderr": 0.029402819641764666,
"mc1": 0.5042839657282742,
"mc1_stderr": 0.017502858577371258,
"mc2": 0.6700593362662586,
"mc2_stderr": 0.014408380056133315
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428175
},
"harness|hellaswag|10": {
"acc": 0.658832901812388,
"acc_stderr": 0.004731324409133276,
"acc_norm": 0.8544114718183629,
"acc_norm_stderr": 0.003519724163310883
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.02629399585547494,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.02629399585547494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02628055093284806,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02628055093284806
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999998,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999998
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259433,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259433
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7531914893617021,
"acc_stderr": 0.028185441301234095,
"acc_norm": 0.7531914893617021,
"acc_norm_stderr": 0.028185441301234095
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7248677248677249,
"acc_stderr": 0.023000086859068652,
"acc_norm": 0.7248677248677249,
"acc_norm_stderr": 0.023000086859068652
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.017066403719657255,
"acc_norm": 0.9,
"acc_norm_stderr": 0.017066403719657255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781668,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781668
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199488,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476434,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.0198801654065888,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.0198801654065888
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.030384169232350825,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.030384169232350825
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8403361344537815,
"acc_stderr": 0.023793353997528802,
"acc_norm": 0.8403361344537815,
"acc_norm_stderr": 0.023793353997528802
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.44370860927152317,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.44370860927152317,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9100917431192661,
"acc_stderr": 0.012264304540230439,
"acc_norm": 0.9100917431192661,
"acc_norm_stderr": 0.012264304540230439
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.01886951464665893,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.01886951464665893
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375853,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375853
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515375,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002159,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002159
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651655,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651655
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253858,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253858
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876345,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876345
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.02038322955113501,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.02038322955113501
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7687150837988826,
"acc_stderr": 0.01410222362315259,
"acc_norm": 0.7687150837988826,
"acc_norm_stderr": 0.01410222362315259
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043714,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043714
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8549382716049383,
"acc_stderr": 0.019594877019727952,
"acc_norm": 0.8549382716049383,
"acc_norm_stderr": 0.019594877019727952
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5782268578878749,
"acc_stderr": 0.012612974369390984,
"acc_norm": 0.5782268578878749,
"acc_norm_stderr": 0.012612974369390984
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7977941176470589,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.7977941176470589,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.01606205642196863,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.01606205642196863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.02366169917709861,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.02366169917709861
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824636,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5042839657282742,
"mc1_stderr": 0.017502858577371258,
"mc2": 0.6700593362662586,
"mc2_stderr": 0.014408380056133315
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498428
},
"harness|gsm8k|5": {
"acc": 0.6694465504169825,
"acc_stderr": 0.012957496367085026
}
}
```
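As an illustrative sketch, the per-task accuracies in a results payload like the one above can be macro-averaged with plain Python. The dict below is a small hypothetical subset of the payload, not the full results:

```python
# Hypothetical subset of the results payload shown above; each key is a
# "harness|task|n_shot" identifier mapping to that task's metrics.
results = {
    "harness|arc:challenge|25": {"acc": 0.6834470989761092},
    "harness|hellaswag|10": {"acc": 0.658832901812388},
    "harness|winogrande|5": {"acc": 0.8255722178374112},
}

# Macro-average: unweighted mean of "acc" over every task that reports it.
accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
macro_avg = sum(accs) / len(accs)
print(f"macro-average acc over {len(accs)} tasks: {macro_avg:.4f}")
```

This mirrors how the "all" entry summarizes per-task metrics, though the leaderboard's exact aggregation may differ.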
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maywell/LogicKor | ---
license: cc-by-sa-4.0
---
# **LogicKor (A Multi-domain Reasoning Benchmark for Korean Language Models)**
## Overview
LogicKor is an LLM-as-a-judge, multi-turn benchmark dataset built to measure the reasoning ability of Korean language models across diverse domains. The dataset consists of 42 multi-turn prompts across six categories (reasoning, math, writing, coding, understanding, grammar).
For the inference and evaluation code, see the [StableFluffy/LogicKor](https://github.com/StableFluffy/LogicKor) repository.
## Category Details
1. Reasoning - logical thinking, problem solving
2. Math - mathematical concepts, calculation
3. Writing - cohesion between sentences, creativity
4. Coding - coding knowledge, feature implementation
5. Understanding - reading comprehension, information extraction, instruction following
6. Grammar - Korean spelling rules, standard pronunciation rules
### Contact
- [Discord Server Link](https://discord.gg/MrBt3PXdXc) |
Sina-Alinejad-2002/trans_operation_prediction | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 7933425
num_examples: 7341
- name: validation
num_bytes: 956113
num_examples: 873
download_size: 5639763
dataset_size: 8889538
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
chricht/tr-tr-sql | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 91827.0544896393
num_examples: 1172
- name: validation
num_bytes: 10263.945510360705
num_examples: 131
download_size: 41474
dataset_size: 102091.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
jlbaker361/prior-hot-50 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: man
dtype: image
- name: woman
dtype: image
- name: boy
dtype: image
- name: girl
dtype: image
- name: character
dtype: image
- name: person
dtype: image
splits:
- name: train
num_bytes: 52839687.0
num_examples: 21
download_size: 52846557
dataset_size: 52839687.0
---
flavor: hot
num_inference_steps: 50
|
akoukas/chatgpt-detector-bias | ---
dataset_info:
features:
- name: title
dtype: string
- name: abstract
dtype: string
- name: ai_generated
dtype: bool
- name: is_ai_generated
dtype: int64
splits:
- name: train
num_bytes: 3815051
num_examples: 4053
download_size: 1955262
dataset_size: 3815051
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_bobofrut__ladybird-base-7B-v8 | ---
pretty_name: Evaluation run of bobofrut/ladybird-base-7B-v8
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bobofrut/ladybird-base-7B-v8](https://huggingface.co/bobofrut/ladybird-base-7B-v8)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bobofrut__ladybird-base-7B-v8\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T15:12:42.138797](https://huggingface.co/datasets/open-llm-leaderboard/details_bobofrut__ladybird-base-7B-v8/blob/main/results_2024-03-24T15-12-42.138797.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6506625889649847,\n\
\ \"acc_stderr\": 0.032104707295835706,\n \"acc_norm\": 0.6496279432262827,\n\
\ \"acc_norm_stderr\": 0.032782725320370666,\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.7681803642917189,\n\
\ \"mc2_stderr\": 0.013953379615259215\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266127,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136438\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7170882294363673,\n\
\ \"acc_stderr\": 0.004494934025462338,\n \"acc_norm\": 0.8918542123083051,\n\
\ \"acc_norm_stderr\": 0.0030992974183235464\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.02537952491077839,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.02537952491077839\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634611,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634611\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.7681803642917189,\n\
\ \"mc2_stderr\": 0.013953379615259215\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250677\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \
\ \"acc_stderr\": 0.012579398235589526\n }\n}\n```"
repo_url: https://huggingface.co/bobofrut/ladybird-base-7B-v8
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-12-42.138797.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-12-42.138797.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- '**/details_harness|winogrande|5_2024-03-24T15-12-42.138797.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T15-12-42.138797.parquet'
- config_name: results
data_files:
- split: 2024_03_24T15_12_42.138797
path:
- results_2024-03-24T15-12-42.138797.parquet
- split: latest
path:
- results_2024-03-24T15-12-42.138797.parquet
---
# Dataset Card for Evaluation run of cloudyu/60B_MoE_Coder_v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/60B_MoE_Coder_v3](https://huggingface.co/cloudyu/60B_MoE_Coder_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3",
"harness_winogrande_5",
split="train")
```
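The per-task config names above follow a simple pattern: `harness_`, then the task name with non-alphanumeric characters replaced by underscores, then the few-shot count. A small helper (illustrative only; the convention is inferred from the config list in this card, not from an official API) can map a harness task name to its config name:

```python
import re

def config_name(task: str, n_shot: int) -> str:
    """Map an eval-harness task name to this dataset's config name.

    e.g. "hendrycksTest-college_physics" -> "harness_hendrycksTest_college_physics_5"
    """
    # Replace every non-alphanumeric character (e.g. "-", ":") with "_".
    sanitized = re.sub(r"[^0-9a-zA-Z]", "_", task)
    return f"harness_{sanitized}_{n_shot}"

print(config_name("hendrycksTest-college_physics", 5))
print(config_name("truthfulqa:mc", 0))
```

This makes it straightforward to load a given task's details programmatically, e.g. `load_dataset(..., config_name("winogrande", 5), split="latest")`.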
## Latest results
These are the [latest results from run 2024-03-24T15:12:42.138797](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3/blob/main/results_2024-03-24T15-12-42.138797.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6506625889649847,
"acc_stderr": 0.032104707295835706,
"acc_norm": 0.6496279432262827,
"acc_norm_stderr": 0.032782725320370666,
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.7681803642917189,
"mc2_stderr": 0.013953379615259215
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266127,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136438
},
"harness|hellaswag|10": {
"acc": 0.7170882294363673,
"acc_stderr": 0.004494934025462338,
"acc_norm": 0.8918542123083051,
"acc_norm_stderr": 0.0030992974183235464
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.02537952491077839,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.02537952491077839
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634611,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634611
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.7681803642917189,
"mc2_stderr": 0.013953379615259215
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250677
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.012579398235589526
}
}
```
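As a quick illustration of working with these numbers (a sketch only — the values below are copied by hand from the JSON above; in practice you would parse the downloaded `results_*.json` file instead), the following snippet ranks a handful of the MMLU subtasks by accuracy:

```python
# A small sample of per-task accuracies ("acc" values) from the results above.
scores = {
    "abstract_algebra": 0.33,
    "anatomy": 0.6444444444444445,
    "astronomy": 0.6973684210526315,
    "computer_security": 0.75,
    "high_school_government_and_politics": 0.9067357512953368,
}

# Rank the sampled subtasks from strongest to weakest accuracy.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

The same approach scales to the full results file, which is useful for spotting a model's strongest and weakest subject areas at a glance.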
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LucasThil/miniwobplusplus_episodes | ---
dataset_info:
features:
- name: episodes
dtype: string
- name: actions
dtype: string
splits:
- name: train
num_bytes: 3384285009
num_examples: 16794
download_size: 276652178
dataset_size: 3384285009
---
# Dataset Card for "miniwobplusplus_episodes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_indic-hi_wikiquote | ---
language: hi
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-hi_wikiquote
# wikiquote_filtered
- Dataset uid: `wikiquote_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0462 % of total
- 0.1697 % of en
- 0.0326 % of fr
- 0.0216 % of ar
- 0.0066 % of zh
- 0.0833 % of pt
- 0.0357 % of es
- 0.0783 % of indic-ta
- 0.0361 % of indic-hi
- 0.0518 % of ca
- 0.0405 % of vi
- 0.0834 % of indic-ml
- 0.0542 % of indic-te
- 0.1172 % of indic-gu
- 0.0634 % of indic-kn
- 0.0539 % of id
- 0.0454 % of indic-ur
- 0.0337 % of indic-mr
- 0.0347 % of eu
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-gu
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-kn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
|
CortexLM/gpt-4-dataset | ---
license: unknown
---
|
nryn21/interior | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_qqp_null_genitive | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 344602
num_examples: 1928
- name: test
num_bytes: 3294429
num_examples: 18748
- name: train
num_bytes: 3074304
num_examples: 17249
download_size: 4181289
dataset_size: 6713335
---
# Dataset Card for "MULTI_VALUE_qqp_null_genitive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qgallouedec/prj_gia_dataset_metaworld_push_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the push-v2 environment, sampled from the push-v2 policy.
This dataset was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_push_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_push_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
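Once loaded, the dictionary can be sanity-checked before use. Below is a minimal sketch using a toy stand-in with the same keys (the array shapes here are illustrative assumptions, not the real push-v2 dimensions):

```python
import numpy as np

# Toy stand-in mirroring the structure of dataset.npy; shapes are invented
# for illustration only.
dataset = {
    "observations": np.zeros((3, 39), dtype=np.float32),
    "actions": np.zeros((3, 4), dtype=np.float32),
    "dones": np.array([False, False, True]),
    "rewards": np.zeros(3, dtype=np.float32),
}

# Basic sanity check: one action, done flag, and reward per observation.
n = len(dataset["observations"])
assert all(len(dataset[k]) == n for k in ("actions", "dones", "rewards"))
print(n)
```

The same checks apply unchanged to the real `dataset` loaded above.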
|
Aditya8005/Road_construction_recommendation | ---
license: apache-2.0
---
|
tfnn/Plyverse-1.0 | ---
tags:
- objaverse
- plyverse
- model
- mesh
- asset
- 3d
- object
pretty_name: Plyverse 1.0
---
# Plyverse 1.0
This is [AllenAI Objaverse 1.0](https://huggingface.co/datasets/allenai/objaverse), but stripped of all textures and materials, baked into solid meshes, and exported as triangulated PLY files.
The models are all normalised to a unit cube and then scaled by 0.55 so that the cube fits within the unit sphere.
The benefits of doing this are reduced file size and memory usage. The dataset can be used to teach neural networks the shapes of objects, which can then be fed into a re-coloring network.
**This is work-in-progress (WIP)** and might never be finished because the process is slow: the script used to convert the GLB files is a Blender Python script that seems to leak memory in its second phase for some reason; you can check it out [here](https://huggingface.co/datasets/tfnn/Plyverse-1.0/blob/main/glb_to_ply.py).
Even without the leak, conversion is still slow, taking around 48 hours per section. In total there are 160 sections with 5,000 models per section, so probably only the first two sections will be converted, for a total of 10,000 models.
Refer to the original [AllenAI Objaverse 1.0 dataset here](https://huggingface.co/datasets/allenai/objaverse) for the meta-data, or download just the meta-data for PLY-000-000.7z and PLY-000-001.7z [here](https://huggingface.co/datasets/tfnn/Plyverse-1.0/resolve/main/plyverse-meta-data.7z?download=true).
# Sections
- [PLY-000-000](https://huggingface.co/datasets/tfnn/Plyverse-1.0/resolve/main/PLY-000-000.7z?download=true)
- [PLY-000-001](https://huggingface.co/datasets/tfnn/Plyverse-1.0/resolve/main/PLY-000-001.7z?download=true)
Possible licenses of these models, as quoted from [AllenAI](https://allenai.org/):
- [CC-BY 4.0](https://creativecommons.org/licenses/by/4.0/)
- [CC-BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/)
- [CC-BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
- [CC-BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)
- [CC0 1.0](https://creativecommons.org/publicdomain/zero/1.0/) |
Nazarko/2D_GPS_Accelerometer | ---
license: unknown
---
|
davanstrien/dataset_readmes | ---
dataset_info:
features:
- name: author
dtype: string
- name: cardData
dtype: 'null'
- name: citation
dtype: string
- name: description
dtype: string
- name: disabled
dtype: bool
- name: downloads
dtype: float64
- name: gated
dtype: bool
- name: id
dtype: string
- name: lastModified
dtype: string
- name: paperswithcode_id
dtype: string
- name: private
dtype: bool
- name: sha
dtype: string
- name: siblings
sequence: 'null'
- name: tags
sequence: string
- name: readme_url
dtype: string
- name: readme
dtype: string
splits:
- name: train
num_bytes: 30248502
num_examples: 7356
download_size: 9717727
dataset_size: 30248502
---
# Dataset Card for "dataset_readmes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ugshanyu/1millionMongolianSentence | ---
license: apache-2.0
task_categories:
- translation
language:
- mn
- en
pretty_name: HelloWorld
--- |
CyberHarem/gravel_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gravel/グラベル/砾 (Arknights)
This is the dataset of gravel/グラベル/砾 (Arknights), containing 397 images and their tags.
The core tags of this character are `animal_ears, pink_hair, long_hair, animal_ear_fluff, hair_between_eyes, breasts, brown_eyes, tail, very_long_hair`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 397 | 607.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gravel_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 397 | 509.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gravel_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 990 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/gravel_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gravel_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, ahoge, barcode_tattoo, bare_shoulders, breastplate, off_shoulder, simple_background, smile, solo, white_background, black_jacket, blush, bright_pupils, closed_mouth, looking_at_viewer, open_jacket, shoulder_tattoo, upper_body, from_side, long_sleeves, pink_eyes |
| 1 | 10 |  |  |  |  |  | 1girl, bare_shoulders, black_jacket, breastplate, looking_at_viewer, off_shoulder, open_jacket, simple_background, solo, upper_body, white_background, blush, long_sleeves, shirt, earpiece, headset, open_mouth, bright_pupils, infection_monitor_(arknights), :d, sleeveless |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_footwear, black_jacket, breastplate, long_sleeves, looking_at_viewer, off_shoulder, open_jacket, simple_background, solo, brown_thighhighs, high_heel_boots, smile, white_background, barcode_tattoo, blush, parted_lips, shoulder_tattoo, sitting, black_thighhighs, knee_boots, torn_thighhighs, covered_navel, red_eyes, thighs, white_leotard |
| 3 | 8 |  |  |  |  |  | 1girl, blush, large_breasts, navel, nipples, pussy, solo, looking_at_viewer, completely_nude, open_mouth, pillow, smile, spread_legs, on_bed, uncensored, anus, on_back, pink_eyes, barcode_tattoo, indoors, red_eyes, sweat, thighs |
| 4 | 22 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, sex, vaginal, large_breasts, penis, solo_focus, spread_legs, navel, cum_in_pussy, completely_nude, looking_at_viewer, barcode_tattoo, girl_on_top, open_mouth, smile, tongue_out, uncensored, cowgirl_position, cum_overflow, pov, sweat, dark_skin, heart-shaped_pupils, lactation, shoulder_tattoo, indoors, mouse_ears, mouse_girl |
| 5 | 24 |  |  |  |  |  | 1girl, black_headwear, black_shirt, long_sleeves, looking_at_viewer, official_alternate_costume, solo, black_gloves, black_shorts, high-waist_shorts, smile, necklace, cowboy_shot, white_cape, medium_breasts, thigh_strap, cabbie_hat, belt, blush, simple_background, white_background, bright_pupils, orange_eyes, holding, large_breasts, parted_lips |
| 6 | 8 |  |  |  |  |  | 1girl, black_footwear, black_headwear, black_shirt, black_shorts, long_sleeves, looking_at_viewer, official_alternate_costume, smile, solo, thigh_strap, white_cape, black_gloves, high-waist_shorts, kneehighs, orange_eyes, black_socks, high_heels, necklace, simple_background, belt, dark-skinned_female, full_body, medium_breasts, white_background, sitting, cabbie_hat, character_name, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ahoge | barcode_tattoo | bare_shoulders | breastplate | off_shoulder | simple_background | smile | solo | white_background | black_jacket | blush | bright_pupils | closed_mouth | looking_at_viewer | open_jacket | shoulder_tattoo | upper_body | from_side | long_sleeves | pink_eyes | shirt | earpiece | headset | open_mouth | infection_monitor_(arknights) | :d | sleeveless | black_footwear | brown_thighhighs | high_heel_boots | parted_lips | sitting | black_thighhighs | knee_boots | torn_thighhighs | covered_navel | red_eyes | thighs | white_leotard | large_breasts | navel | nipples | pussy | completely_nude | pillow | spread_legs | on_bed | uncensored | anus | on_back | indoors | sweat | 1boy | hetero | sex | vaginal | penis | solo_focus | cum_in_pussy | girl_on_top | tongue_out | cowgirl_position | cum_overflow | pov | dark_skin | heart-shaped_pupils | lactation | mouse_ears | mouse_girl | black_headwear | black_shirt | official_alternate_costume | black_gloves | black_shorts | high-waist_shorts | necklace | cowboy_shot | white_cape | medium_breasts | thigh_strap | cabbie_hat | belt | orange_eyes | holding | kneehighs | black_socks | high_heels | dark-skinned_female | full_body | character_name | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:-----------------|:--------------|:---------------|:--------------------|:--------|:-------|:-------------------|:---------------|:--------|:----------------|:---------------|:--------------------|:--------------|:------------------|:-------------|:------------|:---------------|:------------|:--------|:-----------|:----------|:-------------|:--------------------------------|:-----|:-------------|:-----------------|:-------------------|:------------------|:--------------|:----------|:-------------------|:-------------|:------------------|:----------------|:-----------|:---------|:----------------|:----------------|:--------|:----------|:--------|:------------------|:---------|:--------------|:---------|:-------------|:-------|:----------|:----------|:--------|:-------|:---------|:------|:----------|:--------|:-------------|:---------------|:--------------|:-------------|:-------------------|:---------------|:------|:------------|:----------------------|:------------|:-------------|:-------------|:-----------------|:--------------|:-----------------------------|:---------------|:---------------|:--------------------|:-----------|:--------------|:-------------|:-----------------|:--------------|:-------------|:-------|:--------------|:----------|:------------|:--------------|:-------------|:----------------------|:------------|:-----------------|:-----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | | X | X | X | X | | X | X | X | X | X | | X | X | | X | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | X | X | X | X | X | X | X | X | X | | | X | X | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | | | | | X | X | | | X | | | X | | | | | | X | | | | X | | | | | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 22 |  |  |  |  |  | X | | X | | | | | X | | | | X | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 24 |  |  |  |  |  | X | | | | | | X | X | X | X | | X | X | | X | | | | | X | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | | | | | X | X | X | X | | | | | X | | | | | X | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | X | X | X | X | X | X | | X | X | X | X | X | X | X |
|
MicPie/unpredictable_cluster05 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster05
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster05" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have 1000's of tasks, while each task has only a few examples, compared to most current NLP datasets which are very deep, i.e., 10s of tasks with many examples. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonlines file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by 'input', 'options', and 'output' fields. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples, which can be concatenated as a few-shot task. In the case of multiple-choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
### Data Fields
'task': task identifier
'input': column elements of a specific row in the table.
'options': for multiple choice classification, it provides the options to choose from.
'output': target column element of the same row as input.
'pageTitle': the title of the page containing the table.
'outputColName': output column name
'url': url to the website containing the table
'wdcFile': WDC Web Table Corpus file
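As a concrete illustration of this layout, here is a hypothetical jsonlines record with the fields above (all values are invented for illustration and do not come from the actual dataset):

```python
import json

# Hypothetical few-shot example record; every value here is invented.
record = {
    "task": "example_task_0",
    "input": "[Name] Ada Lovelace",
    "options": ["Mathematics", "Chemistry", "Physics"],
    "output": "Mathematics",
    "pageTitle": "Famous scientists",
    "outputColName": "Field",
    "url": "https://example.com/table",
    "wdcFile": "example.json.gz",
}

# A jsonlines file is simply one such JSON object per line.
line = json.dumps(record)
parsed = json.loads(line)
assert parsed["output"] in parsed["options"]
```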
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Details of the annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
NumbersStation/NSText2SQL | ---
language:
- en
task_categories:
- text2text-generation
license:
- other
language_creators:
- crowdsourced
- expert-generated
multilinguality:
- multilingual
tags:
- text-to-sql
size_categories:
- 100K<n<1M
pretty_name: NSText2SQL
---
# Dataset Summary
NSText2SQL is the dataset used to train [NSQL](https://huggingface.co/NumbersStation/nsql-6B) models. The data is curated from more than 20 different public sources across the web with permissible licenses (listed below). All of these datasets come with existing text-to-SQL pairs. We apply various data cleaning and pre-processing techniques including table schema augmentation, SQL cleaning, and instruction generation using existing LLMs. The resulting dataset contains around 290,000 samples of text-to-SQL pairs.
For more information and code, please see [this repository](https://github.com/NumbersStationAI/NSQL).
# How to use it
```python
from datasets import load_dataset
dataset = load_dataset("NumbersStation/NSText2SQL")
```
# Dataset Structure
## Data Instances
Each data instance in this dataset represents a text-to-SQL entry where the instruction has been formatted with the table schema and question. The output is the SQL in SQLite dialect.
## Data Fields
- `instruction` (string): the instruction to generate SQL.
- `output` (string): the ground truth SQL.
- `source` (string): the source dataset of the sample.
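As an illustration of how these fields fit together, here is a minimal sketch, using mock records that mirror the schema rather than the live dataset, of grouping examples by their `source` field (for instance, to apply the per-source license terms listed below):

```python
# Mock records mirroring the NSText2SQL schema (instruction / output / source).
records = [
    {"instruction": "CREATE TABLE city (name text, pop int) -- How many cities are there?",
     "output": "SELECT COUNT(*) FROM city;",
     "source": "wikisql"},
    {"instruction": "CREATE TABLE patient (id int) -- Count the patients.",
     "output": "SELECT COUNT(*) FROM patient;",
     "source": "mimic_iii"},
]

# Group examples by their upstream source so each subset can be handled
# under its original license.
by_source = {}
for rec in records:
    by_source.setdefault(rec["source"], []).append(rec)

print(sorted(by_source))  # ['mimic_iii', 'wikisql']
```

The same grouping can be done on the real dataset with `datasets`' built-in `filter`, e.g. `dataset.filter(lambda ex: ex["source"] == "wikisql")`.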
# Languages
The language of the data is primarily English.
# Source Data and Licensing Information
NSText2SQL is sourced from repositories with various licenses. Any use of all or part of the data gathered in NSText2SQL must abide by the terms of the original licenses, including attribution clauses when relevant. We thank all authors who provided these datasets. We provide provenance information for each dataset below.
| Datasets | License | Link |
| ---------------------- | ------------ | -------------------------------------------------------------------------------------------------------------------- |
| academic | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| advising | CC-BY-4.0 | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| atis | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| restaurants | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| scholar | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| imdb | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| yelp | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| criteria2sql | Apache-2.0 | [https://github.com/xiaojingyu92/Criteria2SQL](https://github.com/xiaojingyu92/Criteria2SQL) |
| css | CC-BY-4.0 | [https://huggingface.co/datasets/zhanghanchong/css](https://huggingface.co/datasets/zhanghanchong/css) |
| eICU | CC-BY-4.0 | [https://github.com/glee4810/EHRSQL](https://github.com/glee4810/EHRSQL) |
| mimic_iii | CC-BY-4.0 | [https://github.com/glee4810/EHRSQL](https://github.com/glee4810/EHRSQL) |
| geonucleardata | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| greatermanchestercrime | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| studentmathscore | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| thehistoryofbaseball | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| uswildfires | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| whatcdhiphop | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| worldsoccerdatabase | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| pesticide | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| mimicsql_data | MIT | [https://github.com/wangpinggl/TREQS](https://github.com/wangpinggl/TREQS) |
| nvbench | MIT | [https://github.com/TsinghuaDatabaseGroup/nvBench](https://github.com/TsinghuaDatabaseGroup/nvBench) |
| sede | Apache-2.0 | [https://github.com/hirupert/sede](https://github.com/hirupert/sede) |
| spider | CC-BY-SA-4.0 | [https://huggingface.co/datasets/spider](https://huggingface.co/datasets/spider) |
| sql_create_context | CC-BY-4.0 | [https://huggingface.co/datasets/b-mc2/sql-create-context](https://huggingface.co/datasets/b-mc2/sql-create-context) |
| squall | CC-BY-SA-4.0 | [https://github.com/tzshi/squall](https://github.com/tzshi/squall) |
| wikisql | BSD 3-Clause | [https://github.com/salesforce/WikiSQL](https://github.com/salesforce/WikiSQL) |
# Citing this work
If you use this data in your work, please cite our work _and_ the appropriate original sources:
To cite NSText2SQL, please use:
```TeX
@software{numbersstation2023NSText2SQL,
author = {Numbers Station Labs},
title = {NSText2SQL: An Open Source Text-to-SQL Dataset for Foundation Model Training},
month = {July},
year = {2023},
url = {https://github.com/NumbersStationAI/NSQL},
}
```
To cite the datasets used in this work, please use:
| Datasets | Cite |
| ---------------------- | ---------------------------------------------------------------------------------------- |
| academic | `\cite{data-advising,data-academic}` |
| advising | `\cite{data-advising}` |
| atis | `\cite{data-advising,data-atis-original,data-atis-geography-scholar}` |
| restaurants | `\cite{data-advising,data-restaurants-logic,data-restaurants-original,data-restaurants}` |
| scholar | `\cite{data-advising,data-atis-geography-scholar}` |
| imdb | `\cite{data-advising,data-imdb-yelp}` |
| yelp | `\cite{data-advising,data-imdb-yelp}` |
| criteria2sql | `\cite{Criteria-to-SQL}` |
| css | `\cite{zhang2023css}` |
| eICU | `\cite{lee2022ehrsql}` |
| mimic_iii | `\cite{lee2022ehrsql}` |
| geonucleardata | `\cite{lee-2021-kaggle-dbqa}` |
| greatermanchestercrime | `\cite{lee-2021-kaggle-dbqa}` |
| studentmathscore | `\cite{lee-2021-kaggle-dbqa}` |
| thehistoryofbaseball | `\cite{lee-2021-kaggle-dbqa}` |
| uswildfires | `\cite{lee-2021-kaggle-dbqa}` |
| whatcdhiphop | `\cite{lee-2021-kaggle-dbqa}` |
| worldsoccerdatabase | `\cite{lee-2021-kaggle-dbqa}` |
| pesticide | `\cite{lee-2021-kaggle-dbqa}` |
| mimicsql_data | `\cite{wang2020text}` |
| nvbench | `\cite{nvBench_SIGMOD21}` |
| sede | `\cite{hazoom2021text}` |
| spider | `\cite{data-spider}` |
| sql_create_context | `\cite{b-mc2_2023_sql-create-context}` |
| squall | `\cite{squall}` |
| wikisql | `\cite{data-wikisql}` |
```TeX
@InProceedings{data-advising,
dataset = {Advising},
author = {Catherine Finegan-Dollak, Jonathan K. Kummerfeld, Li Zhang, Karthik Ramanathan, Sesh Sadasivam, Rui Zhang, and Dragomir Radev},
title = {Improving Text-to-SQL Evaluation Methodology},
booktitle = {Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
month = {July},
year = {2018},
location = {Melbourne, Victoria, Australia},
pages = {351--360},
url = {http://aclweb.org/anthology/P18-1033},
}
@InProceedings{data-imdb-yelp,
dataset = {IMDB and Yelp},
author = {Navid Yaghmazadeh, Yuepeng Wang, Isil Dillig, and Thomas Dillig},
title = {SQLizer: Query Synthesis from Natural Language},
booktitle = {International Conference on Object-Oriented Programming, Systems, Languages, and Applications, ACM},
month = {October},
year = {2017},
pages = {63:1--63:26},
url = {http://doi.org/10.1145/3133887},
}
@article{data-academic,
dataset = {Academic},
author = {Fei Li and H. V. Jagadish},
title = {Constructing an Interactive Natural Language Interface for Relational Databases},
journal = {Proceedings of the VLDB Endowment},
volume = {8},
number = {1},
month = {September},
year = {2014},
pages = {73--84},
url = {http://dx.doi.org/10.14778/2735461.2735468},
}
@InProceedings{data-atis-geography-scholar,
dataset = {Scholar, and Updated ATIS and Geography},
author = {Srinivasan Iyer, Ioannis Konstas, Alvin Cheung, Jayant Krishnamurthy, and Luke Zettlemoyer},
title = {Learning a Neural Semantic Parser from User Feedback},
booktitle = {Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
year = {2017},
pages = {963--973},
location = {Vancouver, Canada},
url = {http://www.aclweb.org/anthology/P17-1089},
}
@article{data-atis-original,
dataset = {ATIS, original},
author = {Deborah A. Dahl, Madeleine Bates, Michael Brown, William Fisher, Kate Hunicke-Smith, David Pallett, Christine Pao, Alexander Rudnicky, and Elizabeth Shriber},
title = {{Expanding the scope of the ATIS task: The ATIS-3 corpus}},
journal = {Proceedings of the workshop on Human Language Technology},
year = {1994},
pages = {43--48},
url = {http://dl.acm.org/citation.cfm?id=1075823},
}
@inproceedings{data-restaurants-logic,
author = {Lappoon R. Tang and Raymond J. Mooney},
  title     = {Automated Construction of Database Interfaces: Integrating Statistical and Relational Learning for Semantic Parsing},
booktitle = {2000 Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora},
year = {2000},
pages = {133--141},
location = {Hong Kong, China},
url = {http://www.aclweb.org/anthology/W00-1317},
}
@inproceedings{data-restaurants-original,
author = {Ana-Maria Popescu, Oren Etzioni, and Henry Kautz},
title = {Towards a Theory of Natural Language Interfaces to Databases},
booktitle = {Proceedings of the 8th International Conference on Intelligent User Interfaces},
year = {2003},
location = {Miami, Florida, USA},
pages = {149--157},
url = {http://doi.acm.org/10.1145/604045.604070},
}
@inproceedings{data-restaurants,
author = {Alessandra Giordani and Alessandro Moschitti},
title = {Automatic Generation and Reranking of SQL-derived Answers to NL Questions},
booktitle = {Proceedings of the Second International Conference on Trustworthy Eternal Systems via Evolving Software, Data and Knowledge},
year = {2012},
location = {Montpellier, France},
pages = {59--76},
url = {https://doi.org/10.1007/978-3-642-45260-4_5},
}
@InProceedings{data-spider,
author = {Tao Yu, Rui Zhang, Kai Yang, Michihiro Yasunaga, Dongxu Wang, Zifan Li, James Ma, Irene Li, Qingning Yao, Shanelle Roman, Zilin Zhang, and Dragomir Radev},
title = {Spider: A Large-Scale Human-Labeled Dataset for Complex and Cross-Domain Semantic Parsing and Text-to-SQL Task},
booktitle = {Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing},
year = {2018},
location = {Brussels, Belgium},
pages = {3911--3921},
url = {http://aclweb.org/anthology/D18-1425},
}
@article{data-wikisql,
author = {Victor Zhong, Caiming Xiong, and Richard Socher},
title = {Seq2SQL: Generating Structured Queries from Natural Language using Reinforcement Learning},
year = {2017},
journal = {CoRR},
volume = {abs/1709.00103},
}
@InProceedings{Criteria-to-SQL,
author = {Yu, Xiaojing and Chen, Tianlong and Yu, Zhengjie and Li, Huiyu and Yang, Yang and Jiang, Xiaoqian and Jiang, Anxiao},
title = {Dataset and Enhanced Model for Eligibility Criteria-to-SQL Semantic Parsing},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
month = {May},
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {5831--5839},
}
@misc{zhang2023css,
title = {CSS: A Large-scale Cross-schema Chinese Text-to-SQL Medical Dataset},
author = {Hanchong Zhang and Jieyu Li and Lu Chen and Ruisheng Cao and Yunyan Zhang and Yu Huang and Yefeng Zheng and Kai Yu},
year = {2023},
}
@article{lee2022ehrsql,
title = {EHRSQL: A Practical Text-to-SQL Benchmark for Electronic Health Records},
author = {Lee, Gyubok and Hwang, Hyeonji and Bae, Seongsu and Kwon, Yeonsu and Shin, Woncheol and Yang, Seongjun and Seo, Minjoon and Kim, Jong-Yeup and Choi, Edward},
journal = {Advances in Neural Information Processing Systems},
volume = {35},
pages = {15589--15601},
year = {2022},
}
@inproceedings{lee-2021-kaggle-dbqa,
title = {KaggleDBQA: Realistic Evaluation of Text-to-SQL Parsers},
author = {Lee, Chia-Hsuan and Polozov, Oleksandr and Richardson, Matthew},
booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)},
pages = {2261--2273},
year = {2021},
}
@inproceedings{squall,
title = {On the Potential of Lexico-logical Alignments for Semantic Parsing to {SQL} Queries},
author = {Tianze Shi and Chen Zhao and Jordan Boyd-Graber and Hal {Daum\'{e} III} and Lillian Lee},
booktitle = {Findings of EMNLP},
year = {2020},
}
@article{hazoom2021text,
title = {Text-to-SQL in the wild: a naturally-occurring dataset based on Stack exchange data},
author = {Hazoom, Moshe and Malik, Vibhor and Bogin, Ben},
journal = {arXiv preprint arXiv:2106.05006},
year = {2021},
}
@inproceedings{wang2020text,
title = {Text-to-SQL Generation for Question Answering on Electronic Medical Records},
author = {Wang, Ping and Shi, Tian and Reddy, Chandan K},
booktitle = {Proceedings of The Web Conference 2020},
pages = {350--361},
year = {2020},
}
@inproceedings{nvBench_SIGMOD21,
title = {Synthesizing Natural Language to Visualization (NL2VIS) Benchmarks from NL2SQL Benchmarks},
author = {Yuyu Luo and Nan Tang and Guoliang Li and Chengliang Chai and Wenbo Li and Xuedi Qin},
booktitle = {Proceedings of the 2021 International Conference on Management of Data, {SIGMOD} Conference 2021, June 20–25, 2021, Virtual Event, China},
publisher = {ACM},
year = {2021},
}
@misc{b-mc2_2023_sql-create-context,
title = {sql-create-context Dataset},
author = {b-mc2},
year = {2023},
url = {https://huggingface.co/datasets/b-mc2/sql-create-context},
note = {This dataset was created by modifying data from the following sources: \cite{zhongSeq2SQL2017, yu2018spider}.},
}
``` |
open-llm-leaderboard/details_lole25__phi-2-sft-lora-ultrachat | ---
pretty_name: Evaluation run of lole25/phi-2-sft-lora-ultrachat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lole25/phi-2-sft-lora-ultrachat](https://huggingface.co/lole25/phi-2-sft-lora-ultrachat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lole25__phi-2-sft-lora-ultrachat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T01:43:41.244095](https://huggingface.co/datasets/open-llm-leaderboard/details_lole25__phi-2-sft-lora-ultrachat/blob/main/results_2024-03-13T01-43-41.244095.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5746481887113103,\n\
\ \"acc_stderr\": 0.0338594429587652,\n \"acc_norm\": 0.5763021072645076,\n\
\ \"acc_norm_stderr\": 0.03455197750653827,\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.01627228795791691,\n \"mc2\": 0.45459342607308556,\n\
\ \"mc2_stderr\": 0.015099933427250509\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326021,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5587532364070902,\n\
\ \"acc_stderr\": 0.0049552127878323814,\n \"acc_norm\": 0.7485560645289783,\n\
\ \"acc_norm_stderr\": 0.004329565016527315\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.040089737857792046,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.040089737857792046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936338,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936338\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7096774193548387,\n \"acc_stderr\": 0.025822106119415898,\n \"\
acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.025822106119415898\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806586,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806586\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868578,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868578\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n \"\
acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6871008939974457,\n\
\ \"acc_stderr\": 0.01658093594030406,\n \"acc_norm\": 0.6871008939974457,\n\
\ \"acc_norm_stderr\": 0.01658093594030406\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702334,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.027466610213140105,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.027466610213140105\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.0272725828498398,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.0272725828498398\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.012599505608336458,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.012599505608336458\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.02947525023601718,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.02947525023601718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.01627228795791691,\n \"mc2\": 0.45459342607308556,\n\
\ \"mc2_stderr\": 0.015099933427250509\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972387\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.533737680060652,\n \
\ \"acc_stderr\": 0.013741096412226756\n }\n}\n```"
repo_url: https://huggingface.co/lole25/phi-2-sft-lora-ultrachat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|arc:challenge|25_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|gsm8k|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hellaswag|10_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T01-43-41.244095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T01-43-41.244095.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- '**/details_harness|winogrande|5_2024-03-13T01-43-41.244095.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T01-43-41.244095.parquet'
- config_name: results
data_files:
- split: 2024_03_13T01_43_41.244095
path:
- results_2024-03-13T01-43-41.244095.parquet
- split: latest
path:
- results_2024-03-13T01-43-41.244095.parquet
---
# Dataset Card for Evaluation run of cloudyu/60B_MoE_Coder_v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/60B_MoE_Coder_v3](https://huggingface.co/cloudyu/60B_MoE_Coder_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3",
"harness_winogrande_5",
split="train")
```
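The configuration names listed in the metadata above follow a fixed pattern: `harness_<task>_<n-shot>`, with any `:` or `-` in the task name replaced by `_`. As an illustration (this helper is not part of the `datasets` API, just a sketch of the naming convention), you can build the config name to pass as the second argument of `load_dataset`:

```python
def task_config_name(task: str, n_shot: int) -> str:
    """Build the config name used by this dataset card from a harness task
    name and shot count, e.g. ("hendrycksTest-anatomy", 5) ->
    "harness_hendrycksTest_anatomy_5"."""
    # ':' and '-' in harness task names become '_' in config names
    return f"harness_{task}_{n_shot}".replace(":", "_").replace("-", "_")


# Examples matching the configs listed above:
print(task_config_name("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
print(task_config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
print(task_config_name("winogrande", 5))             # harness_winogrande_5
```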
## Latest results
These are the [latest results from run 2024-03-13T01:43:41.244095](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__60B_MoE_Coder_v3/blob/main/results_2024-03-13T01-43-41.244095.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5746481887113103,
"acc_stderr": 0.0338594429587652,
"acc_norm": 0.5763021072645076,
"acc_norm_stderr": 0.03455197750653827,
"mc1": 0.3157894736842105,
"mc1_stderr": 0.01627228795791691,
"mc2": 0.45459342607308556,
"mc2_stderr": 0.015099933427250509
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326021,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.5587532364070902,
"acc_stderr": 0.0049552127878323814,
"acc_norm": 0.7485560645289783,
"acc_norm_stderr": 0.004329565016527315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.040089737857792046,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.040089737857792046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936338,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936338
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.025822106119415898,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.025822106119415898
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806586,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806586
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868578,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868578
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6871008939974457,
"acc_stderr": 0.01658093594030406,
"acc_norm": 0.6871008939974457,
"acc_norm_stderr": 0.01658093594030406
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.02778014120702334,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.02778014120702334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.027466610213140105,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.027466610213140105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.0272725828498398,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.0272725828498398
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994098,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336458,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336458
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.02947525023601718,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.02947525023601718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3157894736842105,
"mc1_stderr": 0.01627228795791691,
"mc2": 0.45459342607308556,
"mc2_stderr": 0.015099933427250509
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972387
},
"harness|gsm8k|5": {
"acc": 0.533737680060652,
"acc_stderr": 0.013741096412226756
}
}
```
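The top-level `"all"` block summarizes the per-task scores. A minimal sketch of how such an aggregate can be reproduced, using three task scores copied from the JSON above (that the leaderboard uses an unweighted mean over tasks is an assumption here, not something stated in this card):

```python
# Sketch: reproducing a top-level aggregate from the per-task results.
# The three task entries are copied from the JSON above; the real aggregate
# runs over every task, and the unweighted mean is an assumption.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6126279863481229},
    "harness|hellaswag|10": {"acc_norm": 0.7485560645289783},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.3},
}

def aggregate(results, metric):
    """Unweighted mean of `metric` over every task that reports it."""
    values = [scores[metric] for scores in results.values() if metric in scores]
    return sum(values) / len(values)

print(aggregate(results, "acc_norm"))
```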
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zolak/twitter_dataset_1712974877 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2531644
num_examples: 8685
download_size: 1321316
dataset_size: 2531644
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
coref-data/korean_ecmt_raw | ---
license: cc-by-nc-sa-4.0
---
# Korean Effective Crowdsourcing of Multiple Tasks (ECMT) for Comprehensive Knowledge Extraction
- Project: https://github.com/machinereading/crowdsourcing
- Data source: https://figshare.com/s/7367aeca244efae03068
## Details
Annotated text from Korean Wikipedia and KBox (Korean DBpedia). Includes a crowdsourced training set and an expert-annotated test set (reviewed by four experts).
The dataset was annotated by crowdworkers in multiple stages.
* Phase I: entity mention detection annotation; candidate entity mentions are selected in a text
* Phase II: entity linking annotation; candidate mentions can be linked to a knowledge base
* Phase III: coreference annotation; entities can be linked to pronouns, demonstrative determiners, and antecedent mentions
* Phase IV: relation extraction annotation; relations between entities are annotated
### Annotation Notes
#### Phase I
* For each mention, the annotator selects a category from one of 16 options: person, study field, theory, artifact, organization, location, civilization, event, year, time, quantity, job, animal, plant, material, and term.
* Entities can be things, concepts, ideas, or events:
```
개체란 다른 것들과 분리되어 존재하는 것으로, 개체는 물질적 존재일 필요는 없으며 개념적 아이디어 혹은 사건도 될 수 있다 개체의 대표적인 범주에는 사람, 물체, 조직, 기관, 장소, 시간, 사건 등이 포함된다
(Translation: An entity is something that exists separately from other things; it need not be a material object and can also be a conceptual idea or an event. Representative categories of entities include person, object, organization, institution, place, time, and event.)
```
* Compound nouns are tagged with the largest span:
```
복합명사인 경우 가장 넓은 단위로 태깅해주세요 ex) [상하이] [디즈니랜드] -> [상하이 디즈니랜드]
(Translation: For compound nouns, tag the widest unit, e.g. [Shanghai] [Disneyland] -> [Shanghai Disneyland].)
```
* The final result is created by merging the annotations from two separate annotators.
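The card does not spell out the exact merge rule. One plausible sketch, assuming overlapping spans from the two annotators are merged into the wider span (consistent with the "largest span" guideline above):

```python
# Sketch of one plausible merge rule for the two Phase I annotators: keep every
# span selected by either annotator, and merge overlapping spans into the
# widest covering span. This rule is an illustrative assumption, not the
# documented procedure.
def merge_mentions(ann_a, ann_b):
    spans = sorted(set(ann_a) | set(ann_b))  # spans as (start, end) tuples
    merged = []
    for start, end in spans:
        if merged and start < merged[-1][1]:  # overlaps the previous span
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

print(merge_mentions([(0, 4), (10, 14)], [(2, 6)]))  # [(0, 6), (10, 14)]
```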
#### Phase II
* For each mention, a list of candidates from the knowledge base is shown. The annotator can select one of the candidates, mark the mention as not in the candidate list, or mark it as not an entity.
* Each document was annotated by a single annotator.
#### Phase III
* For each mention, the annotator can select a preceding mention, mark it as having no antecedent, or flag an error. Noun phrases and pronouns are extracted from the parse information.
* "We scaled down the coreference resolution by limiting the scope of the target mentions to a named entity, pronoun, and definite noun phrase."
* Postpositional particles (조사) are not included in the antecedent:
```
[작업대상] 아래 항목에서 조사등을 제외(교정)해 주세요. 그녀는 -> 그녀
(Translation: [Task] Please exclude (correct away) postpositional particles and the like from the items below, e.g. 그녀는 -> 그녀.)
```
## Citation
```
@inproceedings{nam-etal-2020-effective,
title = "Effective Crowdsourcing of Multiple Tasks for Comprehensive Knowledge Extraction",
author = "Nam, Sangha and
Lee, Minho and
Kim, Donghwan and
Han, Kijong and
Kim, Kuntae and
Yoon, Sooji and
Kim, Eun-kyung and
Choi, Key-Sun",
editor = "Calzolari, Nicoletta and
B{\'e}chet, Fr{\'e}d{\'e}ric and
Blache, Philippe and
Choukri, Khalid and
Cieri, Christopher and
Declerck, Thierry and
Goggi, Sara and
Isahara, Hitoshi and
Maegaard, Bente and
Mariani, Joseph and
Mazo, H{\'e}l{\`e}ne and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Twelfth Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.27",
pages = "212--219",
abstract = "Information extraction from unstructured texts plays a vital role in the field of natural language processing. Although there has been extensive research into each information extraction task (i.e., entity linking, coreference resolution, and relation extraction), data are not available for a continuous and coherent evaluation of all information extraction tasks in a comprehensive framework. Given that each task is performed and evaluated with a different dataset, analyzing the effect of the previous task on the next task with a single dataset throughout the information extraction process is impossible. This paper aims to propose a Korean information extraction initiative point and promote research in this field by presenting crowdsourcing data collected for four information extraction tasks from the same corpus and the training and evaluation results for each task of a state-of-the-art model. These machine learning data for Korean information extraction are the first of their kind, and there are plans to continuously increase the data volume. The test results will serve as an initiative result for each Korean information extraction task and are expected to serve as a comparison target for various studies on Korean information extraction using the data collected in this study.",
language = "English",
ISBN = "979-10-95546-34-4",
}
``` |
qgallouedec/prj_gia_dataset_metaworld_door_unlock_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the door-unlock-v2 environment, containing samples from the door-unlock-v2 policy.
This dataset was created as part of the Generally Intelligent Agents (gia) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_door_unlock_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_door_unlock_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
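The arrays above are flat over all transitions. A `dones`-based split recovers individual episodes; a minimal sketch using a synthetic stand-in for the loaded dictionary (that `dones[t] == True` marks the final step of an episode is an assumption about this dataset's convention):

```python
import numpy as np

# Synthetic stand-in with the same keys as the loaded dataset; real arrays
# are much longer and observations/actions may be multi-dimensional.
dataset = {
    "observations": np.arange(5),
    "actions": np.arange(5),
    "rewards": np.ones(5),
    "dones": np.array([False, False, True, False, True]),
}

def split_episodes(dataset):
    """Split flat transition arrays into per-episode dictionaries."""
    ends = np.flatnonzero(dataset["dones"]) + 1   # exclusive episode ends
    starts = np.concatenate(([0], ends[:-1]))     # matching episode starts
    return [
        {key: value[s:e] for key, value in dataset.items()}
        for s, e in zip(starts, ends)
    ]

episodes = split_episodes(dataset)
print(len(episodes))  # 2 episodes in this synthetic example
```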
|
CyberHarem/luna_child_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of luna_child/ルナチャイルド/루나차일드 (Touhou)
This is the dataset of luna_child/ルナチャイルド/루나차일드 (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, drill_hair, hat, short_hair, wings, red_eyes, bow, fairy_wings, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 411.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 300.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1021 | 583.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 388.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1021 | 713.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luna_child_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/luna_child_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, blush, open_mouth, solo, dress, chestnut_mouth |
| 1 | 10 |  |  |  |  |  | 1girl, bangs, black_bowtie, looking_at_viewer, open_mouth, solo, white_dress, hair_between_eyes, long_sleeves, simple_background, blush, chestnut_mouth, puffy_sleeves, white_background, drill_locks, one-hour_drawing_challenge, upper_body, wide_sleeves |
| 2 | 9 |  |  |  |  |  | 2girls, dress, chestnut_mouth, open_mouth |
| 3 | 23 |  |  |  |  |  | loli, 1girl, nipples, nude, blush, solo, flat_chest, pussy, navel, open_mouth, chestnut_mouth |
| 4 | 14 |  |  |  |  |  | hetero, 1girl, loli, penis, solo_focus, 1boy, blush, flat_chest, nipples, sex, nude, open_mouth, vaginal, censored, cum_in_pussy, navel, tears, chestnut_mouth |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, loli, penis, solo_focus, censored, facial, fellatio, cum_on_body, flat_chest, nipples, nude, one_eye_closed |
| 6 | 5 |  |  |  |  |  | 1girl, loli, no_panties, pussy, solo, blush, dress_lift, peeing, navel, censored, squatting |
| 7 | 9 |  |  |  |  |  | 1girl, blush, flat_chest, loli, solo, nipples, topless, barefoot, white_panties, underwear_only |
| 8 | 6 |  |  |  |  |  | bangs, beret, blush, cowboy_shot, long_sleeves, pleated_skirt, sailor_collar, serafuku, white_panties, yellow_neckerchief, 1girl, bespectacled, plaid_skirt, solo, alternate_costume, indoors, miniskirt, standing, contemporary, grey_skirt, hair_between_eyes, looking_at_viewer, sleeves_past_wrists |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | open_mouth | solo | dress | chestnut_mouth | bangs | black_bowtie | looking_at_viewer | white_dress | hair_between_eyes | long_sleeves | simple_background | puffy_sleeves | white_background | drill_locks | one-hour_drawing_challenge | upper_body | wide_sleeves | 2girls | loli | nipples | nude | flat_chest | pussy | navel | hetero | penis | solo_focus | 1boy | sex | vaginal | censored | cum_in_pussy | tears | facial | fellatio | cum_on_body | one_eye_closed | no_panties | dress_lift | peeing | squatting | topless | barefoot | white_panties | underwear_only | beret | cowboy_shot | pleated_skirt | sailor_collar | serafuku | yellow_neckerchief | bespectacled | plaid_skirt | alternate_costume | indoors | miniskirt | standing | contemporary | grey_skirt | sleeves_past_wrists |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------|:-------|:--------|:-----------------|:--------|:---------------|:--------------------|:--------------|:--------------------|:---------------|:--------------------|:----------------|:-------------------|:--------------|:-----------------------------|:-------------|:---------------|:---------|:-------|:----------|:-------|:-------------|:--------|:--------|:---------|:--------|:-------------|:-------|:------|:----------|:-----------|:---------------|:--------|:---------|:-----------|:--------------|:-----------------|:-------------|:-------------|:---------|:------------|:----------|:-----------|:----------------|:-----------------|:--------|:--------------|:----------------|:----------------|:-----------|:---------------------|:---------------|:--------------|:--------------------|:----------|:------------|:-----------|:---------------|:-------------|:----------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | | | X | | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 23 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | X | X | X | | | X | | | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | | | X | X | X | X | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | | | | | X | | | | X | X | | | | | | | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | | X | | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
sayan1101/new_sft_summarize | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 1264287802
num_examples: 287113
- name: validation
num_bytes: 57852724
num_examples: 13368
- name: test
num_bytes: 50029142
num_examples: 11490
download_size: 801958229
dataset_size: 1372169668
---
# Dataset Card for "new_sft_summarize"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/shokuhou_misaki_toarumajutsunoindex | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shokuhou_misaki (To Aru Majutsu no Index)
This is the dataset of shokuhou_misaki (To Aru Majutsu no Index), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
|
PhanAnh/dao_finetune | ---
license: creativeml-openrail-m
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_156 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1178894748.0
num_examples: 231519
download_size: 1201999818
dataset_size: 1178894748.0
---
# Dataset Card for "chunk_156"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kossnocorp/wikipedia-words-en-low | ---
dataset_info:
features:
- name: word
dtype: string
- name: pos
dtype: string
- name: count
dtype: int64
- name: frequency
dtype: float64
splits:
- name: train
num_bytes: 101070565.27033667
num_examples: 2654652
download_size: 33850853
dataset_size: 101070565.27033667
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
newbia/gaming | ---
license: apache-2.0
---
|
CVasNLPExperiments/OxfordPets_test_google_flan_t5_xl_mode_C_A_T_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1405232
num_examples: 3669
- name: fewshot_1_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 2738330
num_examples: 3669
- name: fewshot_3_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 5401217
num_examples: 3669
- name: fewshot_5_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 8057254
num_examples: 3669
download_size: 2983338
dataset_size: 17602033
---
# Dataset Card for "OxfordPets_test_google_flan_t5_xl_mode_C_A_T_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Fakermiya/nsfw-sfw | ---
license: gpl-3.0
---
|
liuyanchen1015/MULTI_VALUE_sst2_absolute_reflex | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 4478
num_examples: 27
- name: test
num_bytes: 7330
num_examples: 50
- name: train
num_bytes: 109973
num_examples: 827
download_size: 56459
dataset_size: 121781
---
# Dataset Card for "MULTI_VALUE_sst2_absolute_reflex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
forta/malicious-smart-contract-dataset | ---
license: mit
task_categories:
- token-classification
tags:
- smart contract
- ethereum
- blockchain
- security
pretty_name: Malicious Smart Contract Classification Dataset
size_categories:
- 100K<n<1M
---
# Malicious Smart Contract Classification Dataset
This dataset includes malicious and benign smart contracts deployed on Ethereum.
Code used to collect this data: [data collection notebook](https://github.com/forta-network/starter-kits/blob/main/malicious-smart-contract-ml-py/data_collection.ipynb)
For more details on how this dataset can be used, please check out this blog: [How Forta’s Predictive ML Models Detect Attacks Before Exploitation](https://forta.org/blog/how-fortas-predictive-ml-models-detect-attacks-before-exploitation/) |
legacy107/qa_wikipedia_augmented_sentence_transformer_negative_farming_128 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: Question
dtype: string
- name: Question_no
dtype: int64
- name: Rewrite
dtype: string
- name: true_page_title
dtype: string
- name: negatives
sequence: string
- name: positive
dtype: string
splits:
- name: train
num_bytes: 31497869
num_examples: 6000
- name: validation
num_bytes: 6130773
num_examples: 1183
download_size: 12202193
dataset_size: 37628642
---
# Dataset Card for "qa_wikipedia_augmented_sentence_transformer_negative_farming_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
schattin/fill1k | ---
license: mit
---
|
Phaedrus/CSAW_combined_264 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label1
dtype: image
- name: label2
dtype: image
- name: label3
dtype: image
- name: label4
dtype: image
- name: label5
dtype: image
- name: label6
dtype: image
- name: label7
dtype: image
- name: label8
dtype: image
- name: label9
dtype: image
- name: label10
dtype: image
- name: label11
dtype: image
- name: label12
dtype: image
splits:
- name: train
num_bytes: 3354389158.0
num_examples: 264
download_size: 154024684
dataset_size: 3354389158.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CSAW_combined_264"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aisuko/simple_english_wikipedia | ---
license: mit
language:
- en
---
For research use only.
The original data is from http://sbert.net/datasets/simplewiki-2020-11-01.jsonl.gz.
We use the `nq-distilbert-base-v1` model to encode all the data into PyTorch tensors, and `normalize` the embeddings using `sentence_transformers.util.normalize_embeddings`.
## How to use
See notebook [Wikipedia Q&A Retrieval-Semantic Search](https://www.kaggle.com/code/aisuko/wikipedia-q-a-retrieval-semantic-search)
## Installing the package
```python
!pip install sentence-transformers==2.3.1
```
## The converting process
```python
# the whole process takes ~1287 s on a P100 GPU
import os
import json
import gzip
from sentence_transformers.util import http_get
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import normalize_embeddings
os.environ['DATASET_NAME']='simplewiki-2020-11-01.jsonl.gz'
os.environ['DATASET_URL']='http://sbert.net/datasets/simplewiki-2020-11-01.jsonl.gz'
os.environ['MODEL_NAME']='multi-qa-MiniLM-L6-cos-v1'
os.environ['CROSS_CODE_NAME']='cross-encoder/ms-marco-MiniLM-L-6-v2'
http_get(os.getenv('DATASET_URL'), os.getenv('DATASET_NAME'))
passages=[]
with gzip.open(os.getenv('DATASET_NAME'), 'rt', encoding='utf-8') as fIn:
    for line in fIn:
        data=json.loads(line.strip())
        # add all paragraphs
        # passages.extend(data['paragraphs'])
        # only add the first paragraph
        # passages.append(data['paragraphs'][0])
        for paragraph in data['paragraphs']:
            # We encode the passages as [title, text]
            passages.append([data['title'], paragraph])
print('Passages:', len(passages))
bi_encoder=SentenceTransformer('nq-distilbert-base-v1')
bi_encoder.max_seq_length=256
bi_encoder.to('cuda')
corpus_embeddings=bi_encoder.encode(passages, convert_to_tensor=True, show_progress_bar=True).to('cuda')
corpus_embeddings=normalize_embeddings(corpus_embeddings)
len(corpus_embeddings)
import pandas as pd
embedding_data=pd.DataFrame(corpus_embeddings.cpu())
embedding_data.to_csv('simple_english_wikipedia_2020_11_01.csv', index=False)
```
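Because the stored embeddings are L2-normalized, retrieval over them reduces to a dot product with a normalized query vector. A minimal NumPy sketch with toy 2-D vectors (shapes and values invented for illustration):

```python
import numpy as np

# Toy corpus embeddings, already L2-normalized, standing in for the
# normalized tensors produced above.
corpus = np.array([
    [1.0, 0.0],
    [0.6, 0.8],
    [0.0, 1.0],
])

# Encode-and-normalize a query the same way (here just a toy vector).
query = np.array([0.8, 0.6])
query = query / np.linalg.norm(query)

# For normalized vectors, cosine similarity is a plain dot product.
scores = corpus @ query
top_k = np.argsort(-scores)[:2]  # indices of the 2 closest passages
print(top_k.tolist())  # → [1, 0]
```

On real data, the same dot product is what `sentence_transformers.util.semantic_search` computes internally over the corpus tensor.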
|
snow_simplified_japanese_corpus | ---
annotations_creators:
- crowdsourced
- other
language_creators:
- found
language:
- en
- ja
license:
- cc-by-4.0
multilinguality:
- translation
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: SNOW T15 and T23 (simplified Japanese corpus)
dataset_info:
- config_name: snow_t15
features:
- name: ID
dtype: string
- name: original_ja
dtype: string
- name: simplified_ja
dtype: string
- name: original_en
dtype: string
splits:
- name: train
num_bytes: 7218115
num_examples: 50000
download_size: 3634132
dataset_size: 7218115
- config_name: snow_t23
features:
- name: ID
dtype: string
- name: original_ja
dtype: string
- name: simplified_ja
dtype: string
- name: original_en
dtype: string
- name: proper_noun
dtype: string
splits:
- name: train
num_bytes: 6704695
num_examples: 34300
download_size: 3641507
dataset_size: 6704695
---
# Dataset Card for SNOW T15 and T23 (simplified Japanese corpus)
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SNOW T15](http://www.jnlp.org/SNOW/T15), [SNOW T23](http://www.jnlp.org/SNOW/T23)
- **Repository:** [N/A]
- **Paper:** ["Simplified Corpus with Core Vocabulary"](https://www.aclweb.org/anthology/L18-1185), ["やさしい⽇本語対訳コーパスの構築"](https://www.anlp.jp/proceedings/annual_meeting/2017/pdf_dir/B5-1.pdf), ["Crowdsourced Corpus of Sentence Simplification with Core Vocabulary"](https://www.aclweb.org/anthology/L18-1072)
- **Leaderboard:** [N/A]
- **Point of Contact:** Check the homepage.
### Dataset Summary
- **SNOW T15:**
A simplified corpus for the Japanese language, containing 50,000 manually simplified and aligned sentences.
This corpus contains the original sentences, simplified sentences and English translation of the original sentences.
It can be used for automatic text simplification as well as translating simple Japanese into English and vice-versa.
The core vocabulary is restricted to 2,000 words, selected by accounting for several factors such as meaning preservation, variation, simplicity, and the UniDic word segmentation criterion.
For details, refer to the explanation page of Japanese simplification (http://www.jnlp.org/research/Japanese_simplification).
The original texts are from "small_parallel_enja: 50k En/Ja Parallel Corpus for Testing SMT Methods", which is a bilingual corpus for machine translation.
- **SNOW T23:**
An expansion corpus of 35,000 sentences rewritten in easy Japanese (simple Japanese vocabulary) based on SNOW T15.
The original texts are from "Tanaka Corpus" (http://www.edrdg.org/wiki/index.php/Tanaka_Corpus).
### Supported Tasks and Leaderboards
It can be used for automatic text simplification in Japanese as well as translating simple Japanese into English and vice-versa.
### Languages
Japanese, simplified Japanese, and English.
## Dataset Structure
### Data Instances
SNOW T15 is an xlsx file with the columns ID, "#日本語(原文)" (Japanese (original)), "#やさしい日本語" (simplified Japanese), and "#英語(原文)" (English (original)).
SNOW T23 is an xlsx file with the columns ID, "#日本語(原文)" (Japanese (original)), "#やさしい日本語" (simplified Japanese), "#英語(原文)" (English (original)), and "#固有名詞" (proper noun).
### Data Fields
- `ID`: sentence ID.
- `original_ja`: original Japanese sentence.
- `simplified_ja`: simplified Japanese sentence.
- `original_en`: original English sentence.
- `proper_noun`: (included only in SNOW T23) proper nouns that the workers extracted. The authors instructed workers not to rewrite proper nouns, leaving the determination of what counts as a proper noun to the workers.
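Loaded into Python, a SNOW T23 record would carry the fields above; a toy illustration (all values invented):

```python
# Hypothetical SNOW T23 instance; field names follow the list above,
# values are invented for illustration.
example = {
    "ID": "1",
    "original_ja": "誰が一番に着くか私には分かりません。",
    "simplified_ja": "誰が一番に着くか、私には分かりません。",
    "original_en": "I don't know who will arrive first.",
    "proper_noun": "",
}

# SNOW T15 records carry the same fields except `proper_noun`.
t15_fields = set(example) - {"proper_noun"}
print(sorted(t15_fields))
```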
### Data Splits
The data is not split.
## Dataset Creation
### Curation Rationale
A dataset for the study of automatic conversion to simplified Japanese (Japanese simplification).
### Source Data
#### Initial Data Collection and Normalization
- **SNOW T15:**
The original texts are from "small_parallel_enja: 50k En/Ja Parallel Corpus for Testing SMT Methods", which is a bilingual corpus for machine translation.
- **SNOW T23:**
The original texts are from "Tanaka Corpus" (http://www.edrdg.org/wiki/index.php/Tanaka_Corpus).
#### Who are the source language producers?
[N/A]
### Annotations
#### Annotation process
- **SNOW T15:**
Five students in the laboratory rewrote the original Japanese sentences to simplified Japanese all by hand.
The core vocabulary is restricted to 2,000 words, selected by accounting for several factors such as meaning preservation, variation, simplicity, and the UniDic word segmentation criterion.
- **SNOW T23:**
Seven people, gathered through crowdsourcing, rewrote all the sentences manually.
Each worker rewrote 5,000 sentences, of which 100 sentences were rewritten to be common among the workers.
The average length of the sentences was kept as consistent as possible so that the amount of work did not vary among the workers.
#### Who are the annotators?
Five students for SNOW T15, seven crowd workers for SNOW T23.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The datasets are part of SNOW, Japanese language resources/tools created by Natural Language Processing Laboratory, Nagaoka University of Technology, Japan.
### Licensing Information
CC BY 4.0
### Citation Information
```
@inproceedings{maruyama-yamamoto-2018-simplified,
title = "Simplified Corpus with Core Vocabulary",
author = "Maruyama, Takumi and
Yamamoto, Kazuhide",
booktitle = "Proceedings of the Eleventh International Conference on Language Resources and Evaluation ({LREC} 2018)",
month = may,
year = "2018",
address = "Miyazaki, Japan",
publisher = "European Language Resources Association (ELRA)",
url = "https://www.aclweb.org/anthology/L18-1185",
}
@inproceedings{yamamoto-2017-simplified-japanese,
title = "やさしい⽇本語対訳コーパスの構築",
author = "⼭本 和英 and
丸⼭ 拓海 and
⾓張 ⻯晴 and
稲岡 夢⼈ and
⼩川 耀⼀朗 and
勝⽥ 哲弘 and
髙橋 寛治",
booktitle = "言語処理学会第23回年次大会",
month = 3月,
year = "2017",
address = "茨城, 日本",
publisher = "言語処理学会",
url = "https://www.anlp.jp/proceedings/annual_meeting/2017/pdf_dir/B5-1.pdf",
}
@inproceedings{katsuta-yamamoto-2018-crowdsourced,
title = "Crowdsourced Corpus of Sentence Simplification with Core Vocabulary",
author = "Katsuta, Akihiro and
Yamamoto, Kazuhide",
booktitle = "Proceedings of the Eleventh International Conference on Language Resources and Evaluation ({LREC} 2018)",
month = may,
year = "2018",
address = "Miyazaki, Japan",
publisher = "European Language Resources Association (ELRA)",
url = "https://www.aclweb.org/anthology/L18-1072",
}
```
### Contributions
Thanks to [@forest1988](https://github.com/forest1988), [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
qgyd2021/international_voice | ---
license: apache-2.0
language:
- zh
- en
- id
- es
---
## International Voice
Recordings from international calls, including audio from the ringing phase and audio after the call is connected.
|
CyberHarem/vritra_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of vritra/ヴリトラ/弗栗多 (Fate/Grand Order)
This is the dataset of vritra/ヴリトラ/弗栗多 (Fate/Grand Order), containing 96 images and their tags.
The core tags of this character are `blonde_hair, long_hair, dragon_horns, breasts, horns, dark_skin, dragon_girl, facial_mark, dark-skinned_female, yellow_eyes, large_breasts, swept_bangs, tail, dragon_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 96 | 155.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vritra_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 96 | 137.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vritra_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 238 | 261.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vritra_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/vritra_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------|
| 0 | 36 |  |  |  |  |  | looking_at_viewer, 1girl, black_dress, solo, bracelet, fur_trim, neck_ring, cleavage_cutout, sharp_teeth, white_background, grin |
| 1 | 6 |  |  |  |  |  | 1girl, black_dress, looking_at_viewer, mouth_veil, solo, long_sleeves, green_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | black_dress | solo | bracelet | fur_trim | neck_ring | cleavage_cutout | sharp_teeth | white_background | grin | mouth_veil | long_sleeves | green_ribbon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:--------------|:-------|:-----------|:-----------|:------------|:------------------|:--------------|:-------------------|:-------|:-------------|:---------------|:---------------|
| 0 | 36 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | X | X | X |
|
Helsinki-NLP/opus_memat | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
- xh
license:
- unknown
multilinguality:
- translation
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- translation
task_ids: []
pretty_name: OpusMemat
dataset_info:
config_name: xh-en
features:
- name: translation
dtype:
translation:
languages:
- xh
- en
splits:
- name: train
num_bytes: 25400442
num_examples: 154764
download_size: 14115561
dataset_size: 25400442
configs:
- config_name: xh-en
data_files:
- split: train
path: xh-en/train-*
---
# Dataset Card for [opus_memat]
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**[memat](http://opus.nlpl.eu/memat.php)
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Xhosa-English parallel corpora from the Medical Machine Translation project, which was funded by EPSRC and worked on machine translation between isiXhosa and English, with a focus on the medical domain.
### Supported Tasks and Leaderboards
The underlying task is machine translation from Xhosa to English.
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
J. Tiedemann, 2012, Parallel Data, Tools and Interfaces in OPUS. In Proceedings of the 8th International Conference on Language Resources and Evaluation (LREC 2012)
### Contributions
Thanks to [@spatil6](https://github.com/spatil6) for adding this dataset. |
Daftdroh/sisi | ---
license: other
---
|
open-llm-leaderboard/details_Mihaiii__Metis-0.4 | ---
pretty_name: Evaluation run of Mihaiii/Metis-0.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mihaiii/Metis-0.4](https://huggingface.co/Mihaiii/Metis-0.4) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Metis-0.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T18:14:22.167641](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.4/blob/main/results_2023-12-23T18-14-22.167641.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6225958370835007,\n\
\ \"acc_stderr\": 0.032727358082431525,\n \"acc_norm\": 0.6305379226640814,\n\
\ \"acc_norm_stderr\": 0.03342131270283292,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5920120177050053,\n\
\ \"mc2_stderr\": 0.01556995067121447\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192601\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6550487950607449,\n\
\ \"acc_stderr\": 0.004743808792037864,\n \"acc_norm\": 0.8390758812985462,\n\
\ \"acc_norm_stderr\": 0.003667099594023357\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267025,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787582,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787582\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.03404705328653881,\n \"\
acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.03404705328653881\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251745,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251745\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\
\ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\
\ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\
\ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\
\ \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657114,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657114\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553704,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417465,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5920120177050053,\n\
\ \"mc2_stderr\": 0.01556995067121447\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698352\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2221379833206975,\n \
\ \"acc_stderr\": 0.011449986902435323\n }\n}\n```"
repo_url: https://huggingface.co/Mihaiii/Metis-0.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|arc:challenge|25_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|gsm8k|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hellaswag|10_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T18-14-22.167641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T18-14-22.167641.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- '**/details_harness|winogrande|5_2023-12-23T18-14-22.167641.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T18-14-22.167641.parquet'
- config_name: results
data_files:
- split: 2023_12_23T18_14_22.167641
path:
- results_2023-12-23T18-14-22.167641.parquet
- split: latest
path:
- results_2023-12-23T18-14-22.167641.parquet
---
# Dataset Card for Evaluation run of Mihaiii/Metis-0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Metis-0.4](https://huggingface.co/Mihaiii/Metis-0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.4",
"harness_winogrande_5",
split="train")
```
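For reference, the timestamped split names used throughout this card are derived mechanically from the run timestamp; a small sketch of that convention (the helper name is ours, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    """Turn a run timestamp such as '2023-12-23T18:14:22.167641'
    into the split name used in this card,
    '2023_12_23T18_14_22.167641'."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-12-23T18:14:22.167641"))
# 2023_12_23T18_14_22.167641
```

The parquet file names instead replace colons with dashes, which is why the paths above read `2023-12-23T18-14-22.167641`.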
## Latest results
These are the [latest results from run 2023-12-23T18:14:22.167641](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.4/blob/main/results_2023-12-23T18-14-22.167641.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6225958370835007,
"acc_stderr": 0.032727358082431525,
"acc_norm": 0.6305379226640814,
"acc_norm_stderr": 0.03342131270283292,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5920120177050053,
"mc2_stderr": 0.01556995067121447
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192601
},
"harness|hellaswag|10": {
"acc": 0.6550487950607449,
"acc_stderr": 0.004743808792037864,
"acc_norm": 0.8390758812985462,
"acc_norm_stderr": 0.003667099594023357
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267025,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787582,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.03404705328653881,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.03404705328653881
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251745,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251745
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464482,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657114,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657114
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553704,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417465,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5920120177050053,
"mc2_stderr": 0.01556995067121447
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698352
},
"harness|gsm8k|5": {
"acc": 0.2221379833206975,
"acc_stderr": 0.011449986902435323
}
}
```
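The JSON above can be post-processed locally once loaded; for instance, a hypothetical helper (not shipped with this repository) to average `acc` over every `hendrycksTest` subtask:

```python
def mean_mmlu_acc(results: dict) -> float:
    """Average the `acc` metric over all hendrycksTest subtasks,
    ignoring every other harness task."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# A tiny excerpt of the results shown above.
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
    "harness|winogrande|5": {"acc": 0.7734806629834254},  # ignored
}
print(round(mean_mmu := mean_mmlu_acc(sample), 4))  # 0.6421
```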
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
whatisslove11/5_class_audio_eval | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': normal_speech
'1': whisper
'2': music
'3': scream
'4': trash
splits:
- name: train
num_bytes: 199258443.5
num_examples: 1171
download_size: 182473985
dataset_size: 199258443.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
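The `class_label` feature above maps integers to category names; in plain Python the mapping looks like the following (the `datasets` library exposes the same conversion through `ClassLabel.int2str` / `ClassLabel.str2int`):

```python
# Integer <-> name mapping declared in the YAML metadata above.
names = ["normal_speech", "whisper", "music", "scream", "trash"]
int2str = dict(enumerate(names))
str2int = {name: i for i, name in enumerate(names)}

print(int2str[1])         # whisper
print(str2int["scream"])  # 3
```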
|
muverrih38/2322323223 | ---
license: other
---
|
cindyangelira/hurricane_harvey_2017 | ---
license: other
---
|
open-llm-leaderboard/details_CultriX__Wernicke-7B-v8 | ---
pretty_name: Evaluation run of CultriX/Wernicke-7B-v8
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CultriX/Wernicke-7B-v8](https://huggingface.co/CultriX/Wernicke-7B-v8) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__Wernicke-7B-v8\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-29T02:45:02.696586](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-v8/blob/main/results_2024-01-29T02-45-02.696586.json)(note\
\ that there might be results for other tasks in the repository if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522489089327046,\n\
\ \"acc_stderr\": 0.03211210883644672,\n \"acc_norm\": 0.651592492844927,\n\
\ \"acc_norm_stderr\": 0.032786280036592515,\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7130370187226833,\n\
\ \"mc2_stderr\": 0.014785526706273856\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244484,\n\
\ \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7102170882294364,\n\
\ \"acc_stderr\": 0.004527343651130796,\n \"acc_norm\": 0.8869747062338179,\n\
\ \"acc_norm_stderr\": 0.003159766252456866\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653354,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653354\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898452,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898452\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7130370187226833,\n\
\ \"mc2_stderr\": 0.014785526706273856\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.012696930106562903\n }\n}\n```"
repo_url: https://huggingface.co/CultriX/Wernicke-7B-v8
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|arc:challenge|25_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|gsm8k|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hellaswag|10_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T02-45-02.696586.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T02-45-02.696586.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- '**/details_harness|winogrande|5_2024-01-29T02-45-02.696586.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T02-45-02.696586.parquet'
- config_name: results
data_files:
- split: 2024_01_29T02_45_02.696586
path:
- results_2024-01-29T02-45-02.696586.parquet
- split: latest
path:
- results_2024-01-29T02-45-02.696586.parquet
---
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v8
Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-v8](https://huggingface.co/CultriX/Wernicke-7B-v8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CultriX__Wernicke-7B-v8",
"harness_winogrande_5",
split="train")
```
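The split name passed above can also be the run's timestamp rather than `"train"` or `"latest"`. As a small illustrative sketch (the split string below is taken from this card's config list), the timestamped split name can be parsed back into a `datetime`:

```python
from datetime import datetime

# Split names are timestamps like "2024_01_29T02_45_02.696586";
# "latest" is an alias for the most recent of them.
split = "2024_01_29T02_45_02.696586"
run_time = datetime.strptime(split, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time.isoformat())  # 2024-01-29T02:45:02.696586
```

This is handy when a configuration accumulates several runs and you want to select or sort them chronologically.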
## Latest results
These are the [latest results from run 2024-01-29T02:45:02.696586](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-v8/blob/main/results_2024-01-29T02-45-02.696586.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in its timestamped split and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.6522489089327046,
"acc_stderr": 0.03211210883644672,
"acc_norm": 0.651592492844927,
"acc_norm_stderr": 0.032786280036592515,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7130370187226833,
"mc2_stderr": 0.014785526706273856
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244484,
"acc_norm": 0.7244027303754266,
"acc_norm_stderr": 0.01305716965576184
},
"harness|hellaswag|10": {
"acc": 0.7102170882294364,
"acc_stderr": 0.004527343651130796,
"acc_norm": 0.8869747062338179,
"acc_norm_stderr": 0.003159766252456866
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653354,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653354
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898452,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898452
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7130370187226833,
"mc2_stderr": 0.014785526706273856
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571778
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.012696930106562903
}
}
```
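On the leaderboard, the aggregated MMLU score is (to a close approximation) the macro-average of the 57 `hendrycksTest` subtask accuracies. A minimal sketch of that aggregation, shown here on just two of the values above:

```python
# Macro-average the hendrycksTest (MMLU) subtask accuracies from a
# results dict shaped like the JSON above (only two entries for brevity).
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
}

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} subtasks: {mmlu_avg:.4f}")
```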
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
datahrvoje/twitter_dataset_1712733400 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 17513
num_examples: 42
download_size: 13574
dataset_size: 17513
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pythainlp/thailaw-v1.0 | ---
language:
- th
license: cc0-1.0
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 920732139
num_examples: 52556
download_size: 212104476
dataset_size: 920732139
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- legal
---
# Dataset Card for "ThaiLaw v1.0"
## English
Thai Law Dataset (Acts of Parliament) v1.0
- Data sourced from the Office of the Council of State, Thailand [https://www.krisdika.go.th/](https://www.krisdika.go.th/) and [law.go.th](https://law.go.th/).
- This is part of the [PyThaiNLP Project](https://github.com/PyThaiNLP/).
- License: the dataset is public domain.
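For a rough sense of scale, the split metadata above implies each act averages about 17 KiB of uncompressed text. A quick back-of-envelope check:

```python
# Figures taken from the dataset_info block in this card's metadata.
num_bytes = 920_732_139
num_examples = 52_556

avg_kib = num_bytes / num_examples / 1024
print(f"~{avg_kib:.1f} KiB of text per document on average")
```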
## Thai
Thai Law Corpus (Acts of Parliament) v1.0
- Data collected from the website of the Office of the Council of State [https://www.krisdika.go.th/](https://www.krisdika.go.th/) and [law.go.th](https://law.go.th/)
- This project is part of the [PyThaiNLP](https://github.com/PyThaiNLP/) development roadmap
- The texts collected in this corpus are in the public domain under Section 7 of the Copyright Act B.E. 2537 (1994) (the following are not subject to copyright under the Act: (1) daily news and facts that are merely news reports and not works in the fields of literature, science, or art [...] (3) regulations, bylaws, announcements, orders, explanations, and official correspondence of ministries, bureaus, departments, or any other state or local agencies [...])
## Citations
If you use `ThaiLaw` in your project or publication, please cite the dataset as follows:
```bib
@misc{thailaw,
doi = {10.5281/ZENODO.10701494},
url = {https://zenodo.org/doi/10.5281/zenodo.10701494},
author = {Phatthiyaphaibun, Wannaphong},
language = {th},
title = {ThaiLaw: Thai Law Dataset},
publisher = {Zenodo},
year = {2024},
copyright = {Creative Commons Zero v1.0 Universal}
}
```
Zenodo: [https://zenodo.org/records/10701494](https://zenodo.org/records/10701494) |
Doub7e/SDv2-Count-Repeated-5 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: T5_last_hidden_states
sequence:
sequence:
sequence: float32
- name: style
dtype: string
splits:
- name: train
num_bytes: 1476708642.5
num_examples: 1140
download_size: 1286922660
dataset_size: 1476708642.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Taegyuu/test1 | ---
license: unknown
---
|
CyberHarem/scene_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of scene/シーン/稀音 (Arknights)
This is the dataset of scene/シーン/稀音 (Arknights), containing 135 images and their tags.
The core tags of this character are `ahoge, hair_ornament, yellow_eyes, short_hair, one_side_up, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 135 | 202.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scene_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 135 | 169.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scene_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 324 | 332.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scene_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scene_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, collared_shirt, looking_at_viewer, open_jacket, red_necktie, solo, white_jacket, white_shirt, long_sleeves, simple_background, upper_body, closed_mouth, hooded_jacket, white_background, brown_vest, holding_camera, hood_down, blush, green_eyes, hair_between_eyes |
| 1 | 14 |  |  |  |  |  | 1girl, solo, long_sleeves, wide_sleeves, hair_bow, looking_at_viewer, oil-paper_umbrella, ponytail, black_bow, blush, closed_mouth, holding_umbrella, white_background, blonde_hair, obi, white_kimono, red_skirt, simple_background, black_footwear, boots, floral_print, full_body, white_pantyhose |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collared_shirt | looking_at_viewer | open_jacket | red_necktie | solo | white_jacket | white_shirt | long_sleeves | simple_background | upper_body | closed_mouth | hooded_jacket | white_background | brown_vest | holding_camera | hood_down | blush | green_eyes | hair_between_eyes | wide_sleeves | hair_bow | oil-paper_umbrella | ponytail | black_bow | holding_umbrella | blonde_hair | obi | white_kimono | red_skirt | black_footwear | boots | floral_print | full_body | white_pantyhose |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------------|:--------------|:--------------|:-------|:---------------|:--------------|:---------------|:--------------------|:-------------|:---------------|:----------------|:-------------------|:-------------|:-----------------|:------------|:--------|:-------------|:--------------------|:---------------|:-----------|:---------------------|:-----------|:------------|:-------------------|:--------------|:------|:---------------|:------------|:-----------------|:--------|:---------------|:------------|:------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | | | X | | | X | X | | X | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/triandra_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of triandra (Fire Emblem)
This is the dataset of triandra (Fire Emblem), containing 88 images and their tags.
The core tags of this character are `blue_eyes, breasts, wings, hair_ornament, hair_over_one_eye, facial_mark, butterfly_wings, hair_flower, fairy_wings, purple_hair, large_breasts, medium_breasts, medium_hair, red_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 88 | 151.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/triandra_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 88 | 79.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/triandra_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 218 | 172.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/triandra_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 88 | 128.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/triandra_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 218 | 244.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/triandra_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/triandra_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, bridal_gauntlets, cleavage_cutout, full_body, gradient_clothes, rose, simple_background, solo, thorns, bangs, black_footwear, center_opening, covered_navel, floating_object, shiny_hair, gradient_background, looking_at_viewer, open_mouth, sleeveless_dress, white_background, detached_sleeves, fairy, grey_background, knee_boots, long_hair, petals, shiny_skin, thighs, torn_clothes |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, midriff, navel, solo, thorns, vines, pink_hair, butterfly, detached_sleeves, parted_lips, thighs, alternate_costume, center_opening, cleavage_cutout, fairy, revealing_clothes, rose |
| 2 | 5 |  |  |  |  |  | 1girl, blush, open_mouth, bare_shoulders, vaginal, 1boy, cleavage, hetero, mosaic_censoring, penis, pussy_juice, spread_legs, sweat, thorns, vines, clothed_sex, detached_sleeves, long_hair, looking_at_viewer, navel, nipples, rose, saliva, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | bridal_gauntlets | cleavage_cutout | full_body | gradient_clothes | rose | simple_background | solo | thorns | bangs | black_footwear | center_opening | covered_navel | floating_object | shiny_hair | gradient_background | looking_at_viewer | open_mouth | sleeveless_dress | white_background | detached_sleeves | fairy | grey_background | knee_boots | long_hair | petals | shiny_skin | thighs | torn_clothes | midriff | navel | vines | pink_hair | butterfly | parted_lips | alternate_costume | revealing_clothes | blush | vaginal | 1boy | cleavage | hetero | mosaic_censoring | penis | pussy_juice | spread_legs | sweat | clothed_sex | nipples | saliva | solo_focus |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:-------------------|:------------------|:------------|:-------------------|:-------|:--------------------|:-------|:---------|:--------|:-----------------|:-----------------|:----------------|:------------------|:-------------|:----------------------|:--------------------|:-------------|:-------------------|:-------------------|:-------------------|:--------|:------------------|:-------------|:------------|:---------|:-------------|:---------|:---------------|:----------|:--------|:--------|:------------|:------------|:--------------|:--------------------|:--------------------|:--------|:----------|:-------|:-----------|:---------|:-------------------|:--------|:--------------|:--------------|:--------|:--------------|:----------|:---------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | X | | | X | | X | X | | | X | | | | | X | | | | X | X | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | | | | X | | | X | | | | | | | | X | X | | | X | | | | X | | | | | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
naphatmanu/index-natural-scandinavian-1 | ---
license: mit
---
|
CyberHarem/hanako_honda_asobiasobase | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hanako Honda
This is the dataset of Hanako Honda, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 674 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 674 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 674 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 674 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
marvmk/scalable_project | ---
dataset_info:
features:
- name: Open
dtype: float64
- name: High
dtype: float64
- name: Low
dtype: float64
- name: Close
dtype: float64
- name: Volume
dtype: int64
- name: Inflation
dtype: float64
- name: CPI
dtype: float64
- name: Quarter_end
dtype: int64
- name: Date
dtype: timestamp[ns, tz=America/New_York]
splits:
- name: train
num_bytes: 359424
num_examples: 4992
download_size: 0
dataset_size: 359424
---
# Dataset Card for "scalable_project"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZBWatHF/Funder-NER | ---
license: cc-by-nc-sa-4.0
task_categories:
- question-answering
- token-classification
language:
- en
pretty_name: Named Entity Recognition of funders of scientific research
size_categories:
- n<1K
---
# Dataset Card for Dataset Named Entity Recognition of funders of scientific research
## Dataset Description
- **Homepage:** https://econstor.eu
- **Repository:** https://github.com/zbw/Funder-NER
- **Paper:** https://doi.org/10.1007/978-3-031-16802-4_24
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
A training/test set for automatically identifying funder entities mentioned in scientific papers. The dataset is generated from Open Access documents hosted at https://econstor.eu and was manually curated/labeled.
### Supported Tasks and Leaderboards
The dataset is for training and testing the automatic recognition of funders as they are acknowledged in scientific papers.
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- `handle`: the PID of the OA working paper
- `crossref_doi`: the PID of the corresponding publication (article), as registered via CrossRef
- `crossref_phrase`: the funder according to CrossRef metadata
- `pdf_phrase`: the acknowledgement phrase from the paper
- `funder`: Y(es) if the work is supported by at least one funder; N(o) if there is no research funder, only OA publishing funding
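For illustration, a single record with these fields might look like the following (all values here are hypothetical, not taken from the dataset):

```json
{
  "handle": "10419/000000",
  "crossref_doi": "10.0000/example.doi",
  "crossref_phrase": "Deutsche Forschungsgemeinschaft",
  "pdf_phrase": "We gratefully acknowledge funding from the Deutsche Forschungsgemeinschaft.",
  "funder": "Y"
}
```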
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
The dataset was manually curated to train the NER recognition based on a QA system.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PROCESOS/IneFrontal | ---
license: c-uda
---
|
yzhuang/autotree_snnxor_n0_l2_2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 371440000
num_examples: 10000
- name: validation
num_bytes: 371440000
num_examples: 10000
- name: test
num_bytes: 371440000
num_examples: 10000
download_size: 349774676
dataset_size: 1114320000
---
# Dataset Card for "autotree_snnxor_n0_l2_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Binaryy/multimodal-real-estate-search | ---
dataset_info:
features:
- name: image
dtype: image
- name: 'Unnamed: 0'
dtype: int64
- name: Title
dtype: string
- name: Location
dtype: string
- name: Details
dtype: string
splits:
- name: train
num_bytes: 70812888.372
num_examples: 1041
download_size: 70215648
dataset_size: 70812888.372
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "multimodal-real-estate-search"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713033201 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21558
num_examples: 50
download_size: 12662
dataset_size: 21558
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
piercemaloney/coqgym_ttv_split_v2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 47541685
num_examples: 641
- name: test
num_bytes: 6100071
num_examples: 282
download_size: 7408688
dataset_size: 53641756
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
donbatatone/seganta | ---
license: openrail
---
|
CyberHarem/vikala_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of vikala/ビカラ (Granblue Fantasy)
This is the dataset of vikala/ビカラ (Granblue Fantasy), containing 500 images and their tags.
The core tags of this character are `red_eyes, bangs, animal_ears, mouse_ears, hair_ornament, short_hair, bow, fake_animal_ears, white_hair, hairclip, hair_bow, hairband, red_bow, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 921.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 465.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1362 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 793.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1362 | 1.60 GiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/vikala_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_hair, blush, hair_bobbles, strapless_shirt, sun_hat, short_twintails, wrist_scrunchie, bare_shoulders, low_twintails, navel, midriff, black_shorts, crop_top, black_shirt, collarbone, short_shorts, water, closed_mouth, shoulder_bag, small_breasts, white_jacket, mouse, outdoors, smile |
| 1 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, open_mouth, penis, sex, vaginal, nipples, solo_focus, navel, cum_in_pussy, spread_legs, bar_censor, small_breasts, bikini, looking_at_viewer, medium_breasts, mosaic_censoring |
| 2 | 25 |  |  |  |  |  | 1girl, bowtie, crop_top, heart_brooch, long_sleeves, looking_at_viewer, midriff, solo, white_shirt, white_skirt, wide_sleeves, navel, pleated_skirt, miniskirt, collared_shirt, open_mouth, mouse, :d, blush, animal, cowboy_shot, frilled_skirt, holding_balloon, white_background, grey_hair |
| 3 | 60 |  |  |  |  |  | 1girl, looking_at_viewer, solo, eyewear_on_head, hair_flower, blush, striped_bikini, sunglasses, navel, bare_shoulders, open_mouth, outdoors, white_skirt, wrist_scrunchie, small_breasts, day, water, bikini_skirt, cleavage, :d, ocean, blue_sky, bridal_garter, choker, frills |
| 4 | 15 |  |  |  |  |  | 1girl, black_hair, long_sleeves, black_skirt, looking_at_viewer, blush, pleated_skirt, sailor_collar, closed_mouth, mouse, solo, blue_bow, blue_jacket, sleeves_past_wrists, white_background, white_shirt, white_thighhighs, bag, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_hair | blush | hair_bobbles | strapless_shirt | sun_hat | short_twintails | wrist_scrunchie | bare_shoulders | low_twintails | navel | midriff | black_shorts | crop_top | black_shirt | collarbone | short_shorts | water | closed_mouth | shoulder_bag | small_breasts | white_jacket | mouse | outdoors | smile | 1boy | hetero | open_mouth | penis | sex | vaginal | nipples | solo_focus | cum_in_pussy | spread_legs | bar_censor | bikini | medium_breasts | mosaic_censoring | bowtie | heart_brooch | long_sleeves | white_shirt | white_skirt | wide_sleeves | pleated_skirt | miniskirt | collared_shirt | :d | animal | cowboy_shot | frilled_skirt | holding_balloon | white_background | grey_hair | eyewear_on_head | hair_flower | striped_bikini | sunglasses | day | bikini_skirt | cleavage | ocean | blue_sky | bridal_garter | choker | frills | black_skirt | sailor_collar | blue_bow | blue_jacket | sleeves_past_wrists | white_thighhighs | bag | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------|:--------|:---------------|:------------------|:----------|:------------------|:------------------|:-----------------|:----------------|:--------|:----------|:---------------|:-----------|:--------------|:-------------|:---------------|:--------|:---------------|:---------------|:----------------|:---------------|:--------|:-----------|:--------|:-------|:---------|:-------------|:--------|:------|:----------|:----------|:-------------|:---------------|:--------------|:-------------|:---------|:-----------------|:-------------------|:---------|:---------------|:---------------|:--------------|:--------------|:---------------|:----------------|:------------|:-----------------|:-----|:---------|:--------------|:----------------|:------------------|:-------------------|:------------|:------------------|:--------------|:-----------------|:-------------|:------|:---------------|:-----------|:--------|:-----------|:----------------|:---------|:---------|:--------------|:----------------|:-----------|:--------------|:----------------------|:-------------------|:------|:--------------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | | X | | | | | | | | X | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 25 |  |  |  |  |  | X | X | X | | X | | | | | | | | X | X | | X | | | | | | | | | X | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 3 | 60 |  |  |  |  |  | X | X | X | | X | | | | | X | X | | X | | | | | | | X | | | X | | | X | | | | X | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
andreabac3/MedQuaAD-Italian-Fauno-Baize | ---
license: gpl-3.0
---
# MedQuaAD-Italian-Fauno-Baize
This dataset is an Italian translation of the MedQuaAD dataset released by the authors of Baize.
## Dataset Description
- **Paper:** https://arxiv.org/abs/2304.01196
### Languages
Italian
## Dataset Structure
### Data Instances
- Sentences: 46,867
- Average number of turns: 3.8
- Average response length per turn: 35.8
### Data Fields
topic, input
### Data Splits
Train
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
https://github.com/project-baize/baize-chatbot
## Additional Information
### Dataset Curators
[Andrea Bacciu](https://andreabac3.github.io/), Dr. [Giovanni Trappolini](https://sites.google.com/view/giovannitrappolini), [Andrea Santilli](https://www.santilli.xyz/), and Professor [Fabrizio Silvestri](https://sites.google.com/diag.uniroma1.it/fabriziosilvestri/home).
### Licensing Information
This project is a derivative of Baize, and we adhere to the licensing constraints imposed by Baize's creators.
### Citation Information
```bibtex
@misc{fauno,
author = {Andrea Bacciu and Giovanni Trappolini and Andrea Santilli and Fabrizio Silvestri},
title = {Fauno: The Italian Large Language Model that will leave you senza parole!},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/andreabac3/Fauno-Italian-LLM}},
}
```
```bibtex
@article{xu2023baize,
title={Baize: An Open-Source Chat Model with Parameter-Efficient Tuning on Self-Chat Data},
author={Xu, Canwen and Guo, Daya and Duan, Nan and McAuley, Julian},
journal={arXiv preprint arXiv:2304.01196},
year={2023}
}
```
|
Bena345/diabetes-readmission | ---
license: mit
language:
- en
tags:
- medical
pretty_name: Diabetes Readmissions
---
# Data source:
Clore, John, Cios, Krzysztof, DeShazo, Jon, and Strack, Beata. (2014).
Diabetes 130-US hospitals for years 1999-2008. UCI Machine Learning
Repository. https://doi.org/10.24432/C5230J.
# Basic data preprocessing was based on this [notebook](https://github.com/csinva/imodels-data/blob/master/notebooks_fetch_data/00_get_datasets_custom.ipynb).
# To load the raw train and test sets
```python
from datasets import load_dataset

dataset_name = "Bena345/diabetes-readmission"
train_set = load_dataset(dataset_name, data_files="train.csv")
test_set = load_dataset(dataset_name, data_files="test.csv")
```
# To load the preprocessed train set
```python
from datasets import load_dataset

dataset_name = "Bena345/diabetes-readmission"
preprocessed_train_set = load_dataset(dataset_name, data_files="preprocessed_train_set.csv")
```
|
Asad321/MKBHD-FJbW9icR-scraped-data-Final-Evaluation-Demo | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 849
num_examples: 2
download_size: 4390
dataset_size: 849
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "MKBHD-FJbW9icR-scraped-data-Final-Evaluation-Demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phyloforfun/HLT_Kew_WCVP_SLTPvA_v1-0_full__T20-OCR-C25-L25-E50-R10 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1488879842
num_examples: 1422868
download_size: 189039917
dataset_size: 1488879842
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibivibiv/alpaca_tasksource8 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 135485283
num_examples: 253970
download_size: 76922724
dataset_size: 135485283
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
merror/custom | ---
license: other
---
|
Rasi1610/Deathce502_series2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 89881535.0
num_examples: 148
- name: val
num_bytes: 22898332.0
num_examples: 38
download_size: 112755919
dataset_size: 112779867.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
CyberHarem/mizumoto_yukari_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mizumoto_yukari/水本ゆかり (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mizumoto_yukari/水本ゆかり (THE iDOLM@STER: Cinderella Girls), containing 369 images and their tags.
The core tags of this character are `brown_hair, long_hair, brown_eyes, bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 369 | 356.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizumoto_yukari_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 369 | 240.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizumoto_yukari_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 763 | 462.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizumoto_yukari_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 369 | 331.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizumoto_yukari_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 763 | 600.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizumoto_yukari_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mizumoto_yukari_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, looking_at_viewer, simple_background, short_sleeves, white_background, blue_skirt, collared_shirt, school_uniform, white_shirt, blush, open_mouth, :d, pleated_skirt, blue_bow, blue_ribbon, closed_mouth, neck_ribbon |
| 1 | 8 |  |  |  |  |  | 1girl, school_uniform, skirt, smile, looking_at_viewer, solo, blush, flute |
| 2 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, open_mouth, smile, solo, dress, hair_ornament, necklace, blush, microphone, bare_shoulders, braid, bracelet, earrings, medium_breasts |
| 3 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, smile, solo, medium_breasts, cleavage, sailor_bikini, white_bikini |
| 4 | 15 |  |  |  |  |  | 1girl, competition_swimsuit, looking_at_viewer, medium_breasts, solo, cowboy_shot, red_one-piece_swimsuit, blush, collarbone, covered_navel, smile, white_background, simple_background, standing, pink_one-piece_swimsuit |
| 5 | 6 |  |  |  |  |  | smile, strapless_dress, wedding_dress, 1girl, bare_shoulders, necklace, solo, bridal_veil, earrings, hair_flower, looking_at_viewer, white_dress, white_gloves, blush, bouquet, collarbone, holding, open_mouth |
| 6 | 8 |  |  |  |  |  | 1boy, blush, hetero, penis, 1girl, mosaic_censoring, solo_focus, sweat, vaginal, nipples, open_mouth, pussy, white_shirt, long_sleeves, medium_breasts, clothed_sex, navel, skirt, spread_legs, heart, on_back, panties_aside, saliva, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | simple_background | short_sleeves | white_background | blue_skirt | collared_shirt | school_uniform | white_shirt | blush | open_mouth | :d | pleated_skirt | blue_bow | blue_ribbon | closed_mouth | neck_ribbon | skirt | smile | flute | dress | hair_ornament | necklace | microphone | bare_shoulders | braid | bracelet | earrings | medium_breasts | navel | cleavage | sailor_bikini | white_bikini | competition_swimsuit | cowboy_shot | red_one-piece_swimsuit | collarbone | covered_navel | standing | pink_one-piece_swimsuit | strapless_dress | wedding_dress | bridal_veil | hair_flower | white_dress | white_gloves | bouquet | holding | 1boy | hetero | penis | mosaic_censoring | solo_focus | sweat | vaginal | nipples | pussy | long_sleeves | clothed_sex | spread_legs | heart | on_back | panties_aside | saliva | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------------|:----------------|:-------------------|:-------------|:-----------------|:-----------------|:--------------|:--------|:-------------|:-----|:----------------|:-----------|:--------------|:---------------|:--------------|:--------|:--------|:--------|:--------|:----------------|:-----------|:-------------|:-----------------|:--------|:-----------|:-----------|:-----------------|:--------|:-----------|:----------------|:---------------|:-----------------------|:--------------|:-------------------------|:-------------|:----------------|:-----------|:--------------------------|:------------------|:----------------|:--------------|:--------------|:--------------|:---------------|:----------|:----------|:-------|:---------|:--------|:-------------------|:-------------|:--------|:----------|:----------|:--------|:---------------|:--------------|:--------------|:--------|:----------|:----------------|:---------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | | | | | X | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | | | | | | | X | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | | | | | | | | X | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | X | X | X | | X | | | | | X | | | | | | | | | X | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | | | | | | | | X | X | | | | | | | | X | | | | X | | X | | | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | | | | | | | | X | X | X | | | | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
sartmis1/wikisql-processed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: messages
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 10327196
num_examples: 56355
- name: test
num_bytes: 2917591
num_examples: 15878
- name: validation
num_bytes: 2917591
num_examples: 15878
download_size: 0
dataset_size: 16162378
---
# Dataset Card for "wikisql-processed"
Based on the [wikisql](https://huggingface.co/datasets/wikisql) dataset.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_en_wikipedia | ---
language: en
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_en_wikipedia
# wikipedia
- Dataset uid: `wikipedia`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 3.2299 % of total
- 4.2071 % of en
- 5.6773 % of ar
- 3.3416 % of fr
- 5.2815 % of es
- 12.4852 % of ca
- 0.4288 % of zh
- 0.4286 % of zh
- 5.4743 % of indic-bn
- 8.9062 % of indic-ta
- 21.3313 % of indic-te
- 4.4845 % of pt
- 4.0493 % of indic-hi
- 11.3163 % of indic-ml
- 22.5300 % of indic-ur
- 4.4902 % of vi
- 16.9916 % of indic-kn
- 24.7820 % of eu
- 11.6241 % of indic-mr
- 9.8749 % of id
- 9.3489 % of indic-pa
- 9.4767 % of indic-gu
- 24.1132 % of indic-as
- 5.3309 % of indic-or
### BigScience processing steps
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: ca
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: zh
#### Filters applied to: zh
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-as
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-or
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
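For intuition, the filter names above can be read as a simple document-level pipeline. The implementations below are hypothetical sketches (the real BigScience preprocessing code lives in its own tooling), but they show what each named step does:

```python
# Hypothetical sketches of the named filter steps; each takes and
# returns a list of documents of the form {"text": ...}.
def dedup_document(docs):
    # Drop exact duplicate documents, keeping the first occurrence.
    seen, out = set(), []
    for d in docs:
        if d["text"] not in seen:
            seen.add(d["text"])
            out.append(d)
    return out

def filter_remove_empty_docs(docs):
    # Drop documents whose text is empty or whitespace-only.
    return [d for d in docs if d["text"].strip()]

def filter_small_docs_bytes(docs, min_bytes):
    # Drop documents smaller than min_bytes when UTF-8 encoded
    # (e.g. 300 or 1024, matching the filter names above).
    return [d for d in docs if len(d["text"].encode("utf-8")) >= min_bytes]

docs = [{"text": "a" * 400}, {"text": ""}, {"text": "a" * 400}, {"text": "short"}]
docs = dedup_document(docs)
docs = filter_remove_empty_docs(docs)
docs = filter_small_docs_bytes(docs, 300)
print(len(docs))  # 1
```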
|
jan-hq/astramindai_nectar_sft_binarized | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 217467579.87271437
num_examples: 118080
- name: test
num_bytes: 24164906.127285615
num_examples: 13121
download_size: 126996278
dataset_size: 241632486.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "astramindai_nectar_sft_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
version-control/the-stack-ds-lib-100k-trace-version | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: hexsha
sequence: string
- name: file_path
sequence: string
- name: code
sequence: string
- name: apis
sequence:
sequence: string
- name: possible_versions
list:
- name: matplotlib
sequence: 'null'
- name: numpy
sequence: string
- name: pandas
sequence: string
- name: scipy
sequence: string
- name: tensorflow
sequence: string
splits:
- name: train
num_bytes: 1216651723
num_examples: 73810
download_size: 432865008
dataset_size: 1216651723
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ivv101/test_dataset | ---
license: cc0-1.0
---
|
ruliad/TemplateGSM | ---
license: cc-by-4.0
task_categories:
- text-generation
- question-answering
language:
- en
pretty_name: TemplateGSM
size_categories:
- 1B<n<10B
configs:
- config_name: templategsm-7473-1k
data_files:
- split: train
path:
- data/1k/0000-0999/*.jsonl
- data/1k/1000-1999/*.jsonl
- data/1k/2000-3999/*.jsonl
- data/1k/4000-7472/*.jsonl
default: true
- config_name: templategsm-4000-1k
data_files:
- split: train
path:
- data/1k/0000-0999/*.jsonl
- data/1k/1000-1999/*.jsonl
- data/1k/2000-3999/*.jsonl
- config_name: templategsm-2000-1k
data_files:
- split: train
path:
- data/1k/0000-0999/*.jsonl
- data/1k/1000-1999/*.jsonl
- config_name: templategsm-1000-1k
data_files:
- split: train
path:
- data/1k/0000-0999/*.jsonl
tags:
- mathematical-reasoning
- reasoning
- finetuning
- pretraining
- llm
---
# Training Language Models with Syntactic Data Generation
## TemplateGSM Dataset
The TemplateGSM dataset is an extensive collection of **over 7 million grade school math problems** (extensible without bound, since problems are generated programmatically), each paired with a code solution and a natural language solution, designed to advance the study and application of mathematical reasoning in language modeling and AI. The dataset is built to challenge and evaluate language models on problems derived from a set of **7,473** predefined **problem templates**, constructed using examples from the GSM8K dataset as prototypes. Each template encapsulates a unique mathematical problem structure, offering a diverse array of challenges that span various domains of mathematics.
GitHub Homepage: https://github.com/iiis-ai/TemplateMath
## Objective
TemplateGSM aims to serve as a benchmark for:
- Assessing language models' proficiency in mathematical reasoning and symbolic computation.
- Training and fine-tuning language models to improve their performance in generating accurate and logically sound mathematical solutions.
- Encouraging the development of models capable of understanding and solving complex mathematical problems, thereby bridging the gap between natural language processing and mathematical reasoning.
## Dataset Structure
TemplateGSM is organized into configurations based on the volume of problems generated from each template:
### Configurations
- **templategsm-1000-1k**: 1k problems generated from each of the first 1,000 templates (templates 0000-0999), totaling 1 million problems.
- **templategsm-2000-1k**: 1k problems generated from each of the first 2,000 templates (templates 0000-1999), totaling 2 million problems.
- **templategsm-4000-1k**: 1k problems generated from each of the first 4,000 templates (templates 0000-3999), totaling 4 million problems.
- **templategsm-7473-1k**: 1k problems generated from each of all 7,473 templates (templates 0000-7472), totaling over 7.47 million problems.
### Data Fields
Each problem in the dataset includes the following fields:
- `problem`: The problem statement.
- `solution_code`: A commented solution code that solves the problem in Python.
- `result`: The final answer to the problem.
- `solution_wocode`: The solution in natural language without the use of code.
- `source`: Indicates which data source the template was constructed from and which seed was used during problem generation, e.g., `gsm8k-train-round2-seed42`.
- `template_id`: The ID of the template from which the problem was generated, e.g., `0`.
- `problem_id`: An index unique to each problem within its template.
## How to Use
```XML
configs:
- config_name: templategsm-7473-1k
data_files:
- split: train
path:
- data/1k/0000-0999/*.jsonl
- data/1k/1000-1999/*.jsonl
- data/1k/2000-3999/*.jsonl
- data/1k/4000-7472/*.jsonl
default: true
- config_name: templategsm-4000-1k
data_files:
- split: train
path:
- data/1k/0000-0999/*.jsonl
- data/1k/1000-1999/*.jsonl
- data/1k/2000-3999/*.jsonl
- config_name: templategsm-2000-1k
data_files:
- split: train
path:
- data/1k/0000-0999/*.jsonl
- data/1k/1000-1999/*.jsonl
- config_name: templategsm-1000-1k
data_files:
- split: train
path:
- data/1k/0000-0999/*.jsonl
```
To access the TemplateGSM dataset, you can use the Huggingface `datasets` library:
```python
from datasets import load_dataset
# Load a specific configuration
dataset = load_dataset("math-ai/TemplateGSM", "templategsm-4000-1k") # or any valid config_name
```
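Each loaded record exposes the fields listed under "Data Fields" as plain dictionary keys. Below is a minimal sketch using a hypothetical record (the values shown are illustrative, not taken from the dataset; real records come from the `load_dataset` call above):

```python
# Hypothetical TemplateGSM record illustrating the documented fields.
example = {
    "problem": "Alice has 3 apples and buys 4 more. How many apples does she have?",
    "solution_code": "# total apples\nresult = 3 + 4\nprint(result)",
    "result": "7",
    "solution_wocode": "Alice starts with 3 apples and buys 4 more, so she has 3 + 4 = 7 apples.",
    "source": "gsm8k-train-round2-seed42",
    "template_id": 0,
    "problem_id": 0,
}

def render(record):
    """Format one record for quick inspection."""
    return (f"[template {record['template_id']} / problem {record['problem_id']}] "
            f"{record['problem']} -> {record['result']}")

print(render(example))
```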
## License
This dataset is made available under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.
## Citation
If you utilize Syntactic Data Generation (SDG) or the TemplateGSM dataset in your research or application, please consider citing it (GitHub Homepage: https://github.com/iiis-ai/TemplateMath):
```bibtex
@misc{zhang2024training,
title={Training Language Models with Syntactic Data Generation},
author={Zhang, Yifan and Luo, Yifan and Yuan, Yang and Yao, Andrew Chi-Chih},
year={2024},
}
```
|
Jing24/seperate_2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 6921907
num_examples: 7848
download_size: 1327593
dataset_size: 6921907
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/Voxceleb1_test_original | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
splits:
- name: test
num_bytes: 1291123262.75
num_examples: 4874
download_size: 1288168122
dataset_size: 1291123262.75
---
# Dataset Card for "Voxceleb1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/0141c882 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 172
num_examples: 10
download_size: 1314
dataset_size: 172
---
# Dataset Card for "0141c882"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jjz5463/topics_common_crawl | ---
size_categories:
- n<1K
dataset_info:
features:
- name: Common crawl text
dtype: string
- name: Topics
dtype: string
splits:
- name: train
num_bytes: 469131
num_examples: 100
download_size: 246783
dataset_size: 469131
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
library_name: datadreamer
tags:
- datadreamer
- datadreamer-0.25.0
- synthetic
- gpt-4
---
# Dataset Card
[Add more information here](https://huggingface.co/datasets/templates/dataset-card-example)
---
This dataset was produced with [DataDreamer 🤖💤](https://datadreamer.dev). The synthetic dataset card can be found [here](datadreamer.json). |
HES-XPLAIN/SportsImageClassification | ---
license: cc0-1.0
---
## Sports Image Classification dataset
From Kaggle: [100 Sports Image Classification](https://www.kaggle.com/datasets/ponrajsubramaniian/sportclassificationdataset)
A collection of sports images covering 100 different sports. Images are 224x224x3 in JPEG format.
Data is separated into train, test and valid directories.
* 13493 train images
* 500 test images
* 500 validate images
Additionally, a CSV file is included for those who wish to use it to create their own train, test and validation splits.
### Clone
```
git clone https://huggingface.co/datasets/HES-XPLAIN/SportsImageClassification
```
or
```
git clone git@hf.co:datasets/HES-XPLAIN/SportsImageClassification
```
### Modify dataset
To add data, make sure Git LFS is installed.
```
git lfs install
```
Then proceed accordingly with `git add` and `git push`.
|
q734738781/Chem-Catalysis-Chat | ---
license: apache-2.0
---
QA pairs generated by GPT-3.5 Turbo using self-instruction methods over literature papers on catalysis.
The prompt used to generate this dataset is as follows:
```
Please raise 10 valuable scientific fact questions and their answers based on the text, and output them as question-answer pairs to be used as prompts for LLMs.
The questions should be related to the text, and the answers should be able to be inferred from the text.
The questions should be diverse and cover different aspects of the text.
Avoid asking very common questions, like simple definition questions. You are encouraged to generate more complex reasoning problems and problems involving material properties.
Try to generate questions to cover every aspect of the presented text, but avoid generating repeated questions.
Notice: You are generating prompts for LLMs, so the following guidelines should be followed:
1. Answer using the data from provided information.
2. For content with references, like Eqn 1, etc., please find the specific equation and the name of the reference and answer the question.
If it is not provided, please avoid generating such questions. In particular, figures are not included in the provided text, so avoid such questions unless the information can be inferred from the caption.
3. QA pairs with unspecified content when read independently are prohibited as questions.
DO NOT include "this", "that", "previous work", "the paper", "this work", "the text", "the research", "the authors" in your questions. Just focus on the scientific facts.
4. For property-related problems, add as much detail to the answers as possible, such as stating the specific chemical elements and numbers.
5. Do not mention the contribution of the paper, the significance of the research, the novelty of the work, etc. Focus on the scientific facts.
Your response format should be as follows:
Q: What is the formula of water?
A: The formula of water is H2O.
...
Here is the provided text:
{NEW_TEXT_PROMPT}
Remember:
QA pairs with unspecified content when read independently are prohibited as questions.
DO NOT include "this", "that", "previous work", "the paper", "this work", "the text", "the research", "the authors" in your questions. Just focus on the scientific facts.
```
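The `{NEW_TEXT_PROMPT}` placeholder in the prompt above is filled with one paper section at a time. A rough sketch of that loop is shown below (the section texts are hypothetical and the GPT-3.5 Turbo call is stubbed out, since the actual generation script is not published with the dataset):

```python
# Abridged stand-in for the full prompt shown above.
PROMPT_TEMPLATE = (
    "Please raise 10 valuable scientific fact questions ...\n"
    "Here is the provided text:\n{NEW_TEXT_PROMPT}"
)

sections = [  # hypothetical paper sections
    "The Pt/CeO2 catalyst reached 95% CO conversion at 150 C.",
    "Doping with 2 wt% Cu lowered the light-off temperature by 30 C.",
]

def build_prompt(section_text: str) -> str:
    """Fill the placeholder with one literature section."""
    return PROMPT_TEMPLATE.replace("{NEW_TEXT_PROMPT}", section_text)

# In the real pipeline each filled prompt is sent to GPT-3.5 Turbo and the
# returned Q/A pairs are collected; here we only build the prompts.
prompts = [build_prompt(s) for s in sections]
print(len(prompts))
```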
Here, {NEW_TEXT_PROMPT} is replaced with the actual literature text, one section at a time, to generate the dataset. |
Minnyeong/aihub_NL2SQ | ---
license: other
language:
- ko
size_categories:
- 100K<n<1M
--- |
tyzhu/squad_qa_wrong_rare_v5_full_recite_full_passage_random_permute_rerun_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 5213538.444244605
num_examples: 2875
- name: validation
num_bytes: 587391
num_examples: 300
download_size: 1542699
dataset_size: 5800929.444244605
---
# Dataset Card for "squad_qa_wrong_rare_v5_full_recite_full_passage_random_permute_rerun_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gagan3012/multilingual-llava-bench | ---
dataset_info:
- config_name: arabic
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22342774.0
num_examples: 60
download_size: 9778993
dataset_size: 22342774.0
- config_name: bengali
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22378020.0
num_examples: 60
download_size: 9783130
dataset_size: 22378020.0
- config_name: chinese
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22317502.0
num_examples: 60
download_size: 9772605
dataset_size: 22317502.0
- config_name: french
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22327391.0
num_examples: 60
download_size: 9773783
dataset_size: 22327391.0
- config_name: hindi
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22385129.0
num_examples: 60
download_size: 9799590
dataset_size: 22385129.0
- config_name: japanese
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22333016.0
num_examples: 60
download_size: 9782382
dataset_size: 22333016.0
- config_name: russian
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22355236.0
num_examples: 60
download_size: 9792575
dataset_size: 22355236.0
- config_name: spanish
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22326471.0
num_examples: 60
download_size: 9781970
dataset_size: 22326471.0
- config_name: urdu
features:
- name: question_id
dtype: int64
- name: image
dtype: image
- name: question
dtype: string
- name: caption
dtype: string
- name: image_id
dtype: string
- name: gpt_answer
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22349409.0
num_examples: 60
download_size: 9784751
dataset_size: 22349409.0
configs:
- config_name: arabic
data_files:
- split: train
path: arabic/train-*
- config_name: bengali
data_files:
- split: train
path: bengali/train-*
- config_name: chinese
data_files:
- split: train
path: chinese/train-*
- config_name: french
data_files:
- split: train
path: french/train-*
- config_name: hindi
data_files:
- split: train
path: hindi/train-*
- config_name: japanese
data_files:
- split: train
path: japanese/train-*
- config_name: russian
data_files:
- split: train
path: russian/train-*
- config_name: spanish
data_files:
- split: train
path: spanish/train-*
- config_name: urdu
data_files:
- split: train
path: urdu/train-*
---
|
CyberHarem/poporon_jashinchandropkick | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Poporon (ぽぽろん)
This is the dataset of Poporon (ぽぽろん), containing 269 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 269 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 668 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 269 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 269 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 269 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 269 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 269 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 668 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 668 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 668 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
irlab-udc/metahate-sample | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
pretty_name: metahate-sample
size_categories:
- n<1K
---
# MetaHate: A Dataset for Unifying Efforts on Hate Speech Detection (SAMPLE)
This is a 100-entry sample of a meta-collection of 36 hate speech datasets built from social media comments.
## Dataset Structure
The original dataset contains 1,226,202 social media posts in a TSV file. This is a sample of 100 entries. Each element contains the following fields:
| Field Name | Type | Possible Values | Description |
|------------|------|-----------------|----------------------------------------------------------------------|
| text | str | any | Social media post. Each post is unique. |
| label | int | 0, 1 | Label of the post. 0 for non-hate speech posts, 1 for hate speech. | |
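The two fields above map onto a two-column TSV. A minimal parsing sketch is shown below (the `text`/`label` header names are assumptions based on the field table, and the rows are inline stand-ins rather than the released file):

```python
import csv
import io

# Inline stand-in for the released TSV (the real file has ~1.2M rows).
sample_tsv = "text\tlabel\nhave a great day everyone\t0\nsome hateful post\t1\n"

rows = list(csv.DictReader(io.StringIO(sample_tsv), delimiter="\t"))
hate = [r for r in rows if int(r["label"]) == 1]
print(len(rows), len(hate))
```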
TuhinColumbia/FLUTE | ---
license: afl-3.0
---
|
ruanchaves/assin_por_Latn_to_eng_Latn | ---
dataset_info:
features:
- name: sentence_pair_id
dtype: int64
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: relatedness_score
dtype: float32
- name: entailment_judgment
dtype:
class_label:
names:
'0': NONE
'1': ENTAILMENT
'2': PARAPHRASE
- name: __language__
dtype: string
splits:
- name: train
num_bytes: 993418
num_examples: 5000
- name: test
num_bytes: 777672
num_examples: 4000
- name: validation
num_bytes: 198351
num_examples: 1000
download_size: 0
dataset_size: 1969441
---
# Dataset Card for "assin_por_Latn_to_eng_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
canristiian/drug_rules_sort1 | ---
license: apache-2.0
---
|
Rebecca19990101/PetroGPT-Instruct-v1 | ---
license: cc-by-nc-sa-4.0
---
|
arbors-robotics/friscoisd_course_list | ---
license: apache-2.0
---
|