| datasetId | card |
|---|---|
GEM-submissions/lewtun__hugging-face-test-t5-base.outputs.json-36bf2a59__1646051364 | ---
benchmark: gem
type: prediction
submission_name: Hugging Face test T5-base.outputs.json 36bf2a59
---
|
Zhongxing0129/authorlist_train | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Austen
'1': Wilde
'2': Tolstoy
'3': Dickens
splits:
- name: train
num_bytes: 2506079.861876742
num_examples: 5812
download_size: 1671550
dataset_size: 2506079.861876742
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EleutherAI/mutual | ---
license: other
--- |
liuyanchen1015/MULTI_VALUE_mnli_serial_verb_go | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 55092
num_examples: 228
- name: dev_mismatched
num_bytes: 93872
num_examples: 382
- name: test_matched
num_bytes: 69516
num_examples: 272
- name: test_mismatched
num_bytes: 95584
num_examples: 370
- name: train
num_bytes: 2594262
num_examples: 10548
download_size: 1782968
dataset_size: 2908326
---
# Dataset Card for "MULTI_VALUE_mnli_serial_verb_go"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/whisper_transcriptions.reazonspeech.all_13 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30390992338.0
num_examples: 267255
download_size: 30153778852
dataset_size: 30390992338.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
Shubh8434/All_1 | ---
license: apache-2.0
---
|
tessiw/german_OpenOrca5 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 419339635
num_examples: 250000
download_size: 240779567
dataset_size: 419339635
---
# Dataset Card for "german_OpenOrca5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_LeroyDyer__SpydazWeb_AI_BASE_128k | ---
pretty_name: Evaluation run of LeroyDyer/SpydazWeb_AI_BASE_128k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/SpydazWeb_AI_BASE_128k](https://huggingface.co/LeroyDyer/SpydazWeb_AI_BASE_128k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__SpydazWeb_AI_BASE_128k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T21:49:44.739166](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__SpydazWeb_AI_BASE_128k/blob/main/results_2024-03-21T21-49-44.739166.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\"\
\ split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.638316387744326,\n\
\ \"acc_stderr\": 0.03233739079830125,\n \"acc_norm\": 0.641820651359968,\n\
\ \"acc_norm_stderr\": 0.03298269279574279,\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476855,\n \"mc2\": 0.5782296636685702,\n\
\ \"mc2_stderr\": 0.015300937426837897\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910474,\n\
\ \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6617207727544314,\n\
\ \"acc_stderr\": 0.004721571443354415,\n \"acc_norm\": 0.8462457677753435,\n\
\ \"acc_norm_stderr\": 0.0035997580435468074\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659356,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659356\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884403,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884403\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n\
\ \"acc_stderr\": 0.01631237662921307,\n \"acc_norm\": 0.38994413407821227,\n\
\ \"acc_norm_stderr\": 0.01631237662921307\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666787,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.01271845661870176,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.01271845661870176\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.019373332420724504,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.019373332420724504\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482708,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482708\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476855,\n \"mc2\": 0.5782296636685702,\n\
\ \"mc2_stderr\": 0.015300937426837897\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386788\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5003790750568613,\n \
\ \"acc_stderr\": 0.01377248076162617\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/SpydazWeb_AI_BASE_128k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|arc:challenge|25_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|gsm8k|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hellaswag|10_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-49-44.739166.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T21-49-44.739166.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- '**/details_harness|winogrande|5_2024-03-21T21-49-44.739166.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T21-49-44.739166.parquet'
- config_name: results
data_files:
- split: 2024_03_21T21_49_44.739166
path:
- results_2024-03-21T21-49-44.739166.parquet
- split: latest
path:
- results_2024-03-21T21-49-44.739166.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/SpydazWeb_AI_BASE_128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/SpydazWeb_AI_BASE_128k](https://huggingface.co/LeroyDyer/SpydazWeb_AI_BASE_128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__SpydazWeb_AI_BASE_128k",
"harness_winogrande_5",
split="train")
```
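The aggregated scores live in the "results" configuration; as a minimal sketch (assuming only the `datasets` library and the config/split layout listed in this card), they can be loaded the same way:
```python
from datasets import load_dataset

# The "results" config holds the aggregated scores of a run; the "latest"
# split always points to the most recent evaluation (2024-03-21 here).
results = load_dataset(
    "open-llm-leaderboard/details_LeroyDyer__SpydazWeb_AI_BASE_128k",
    "results",
    split="latest",
)
print(results[0])
```
A timestamped split name such as `2024_03_21T21_49_44.739166` can be passed instead of `latest` to pin a specific run.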
## Latest results
These are the [latest results from run 2024-03-21T21:49:44.739166](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__SpydazWeb_AI_BASE_128k/blob/main/results_2024-03-21T21-49-44.739166.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.638316387744326,
"acc_stderr": 0.03233739079830125,
"acc_norm": 0.641820651359968,
"acc_norm_stderr": 0.03298269279574279,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476855,
"mc2": 0.5782296636685702,
"mc2_stderr": 0.015300937426837897
},
"harness|arc:challenge|25": {
"acc": 0.613481228668942,
"acc_stderr": 0.014230084761910474,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.013921008595179344
},
"harness|hellaswag|10": {
"acc": 0.6617207727544314,
"acc_stderr": 0.004721571443354415,
"acc_norm": 0.8462457677753435,
"acc_norm_stderr": 0.0035997580435468074
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659356,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659356
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884403,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884403
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203627,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913917,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913917
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38994413407821227,
"acc_stderr": 0.01631237662921307,
"acc_norm": 0.38994413407821227,
"acc_norm_stderr": 0.01631237662921307
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666787,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.01271845661870176,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.01271845661870176
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.019373332420724504,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.019373332420724504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482708,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482708
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476855,
"mc2": 0.5782296636685702,
"mc2_stderr": 0.015300937426837897
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386788
},
"harness|gsm8k|5": {
"acc": 0.5003790750568613,
"acc_stderr": 0.01377248076162617
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Cohere/miracl-fr-queries-22-12 | ---
annotations_creators:
- expert-generated
language:
- fr
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# MIRACL (fr) embedded with cohere.ai `multilingual-22-12` encoder
We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
The query embeddings can be found in [Cohere/miracl-fr-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-fr-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-fr-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-fr-corpus-22-12).
For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus).
Dataset info:
> MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world.
>
> The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage.
## Embeddings
We compute for `title+" "+text` the embeddings using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Loading the dataset
In [miracl-fr-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-fr-corpus-22-12) we provide the corpus embeddings. Note that, depending on the selected split, the respective files can be quite large.
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-fr-corpus-22-12", split="train")
```
Or you can also stream it without downloading it before:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-fr-corpus-22-12", split="train", streaming=True)
for doc in docs:
docid = doc['docid']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
Have a look at [miracl-fr-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-fr-queries-22-12) where we provide the query embeddings for the MIRACL dataset.
To search the documents, score them with the **dot-product** between the query embedding and the document embeddings.
You can compute these scores either with a vector database (recommended) or directly, as in the example below.
A full search example:
```python
# Attention! For large datasets, this requires a lot of memory to store
# all document embeddings and to compute the dot product scores.
# Only use this for smaller datasets. For large datasets, use a vector DB
from datasets import load_dataset
import torch
# Load documents + embeddings
docs = load_dataset("Cohere/miracl-fr-corpus-22-12", split="train")
doc_embeddings = torch.tensor(docs['emb'])
# Load queries
queries = load_dataset("Cohere/miracl-fr-queries-22-12", split="dev")
# Select the first query as example
qid = 0
query = queries[qid]
# Wrap the single query embedding in a batch dimension so torch.mm gets a 2D tensor
query_embedding = torch.tensor([query['emb']])
# Compute dot scores between the query embedding and all document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query['query'])
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'])
```
You can get embeddings for new queries using our API:
```python
#Run: pip install cohere
import cohere
co = cohere.Client(api_key) # Add your Cohere API key here
texts = ['my search query']
response = co.embed(texts=texts, model='multilingual-22-12')
query_embedding = response.embeddings[0] # Get the embedding for the first text
```
## Performance
In the following table we compare the cohere multilingual-22-12 model with Elasticsearch version 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch doesn't support all languages that are part of the MIRACL dataset.
We compute nDCG@10 (a ranking quality metric) as well as hit@3: whether at least one relevant document appears in the top-3 results. We find hit@3 easier to interpret, as it reflects the fraction of queries for which a relevant document is found among the top-3 results.
Note: MIRACL only annotated a small fraction of passages (10 per query) for relevancy. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as annotation holes. The true nDCG@10 and hit@3 performance is therefore likely higher than reported.
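For reference, both metrics can be computed as follows. This is a minimal sketch assuming binary relevance labels; `ranked_ids` and `relevant_ids` are illustrative names, not fields of this dataset.
```python
import math

def hit_at_k(ranked_ids, relevant_ids, k=3):
    # 1 if at least one relevant document appears among the top-k results
    return int(any(doc_id in relevant_ids for doc_id in ranked_ids[:k]))

def ndcg_at_k(ranked_ids, relevant_ids, k=10):
    # Binary-relevance nDCG: DCG of the ranking divided by the ideal DCG
    dcg = sum(1.0 / math.log2(rank + 2)
              for rank, doc_id in enumerate(ranked_ids[:k])
              if doc_id in relevant_ids)
    ideal_dcg = sum(1.0 / math.log2(rank + 2)
                    for rank in range(min(len(relevant_ids), k)))
    return dcg / ideal_dcg if ideal_dcg > 0 else 0.0
```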
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 hit@3 |
|---|---|---|---|---|
| miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 |
| miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 |
| miracl-de | 44.4 | 60.7 | 19.6 | 29.8 |
| miracl-en | 44.6 | 62.2 | 30.2 | 43.2 |
| miracl-es | 47.0 | 74.1 | 27.0 | 47.2 |
| miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 |
| miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 |
| miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 |
| miracl-id | 44.8 | 63.8 | 39.2 | 54.7 |
| miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 |
| **Avg** | 51.7 | 67.5 | 34.7 | 46.0 |
Further languages (not supported by Elasticsearch):
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 |
|---|---|---|
| miracl-fa | 44.8 | 53.6 |
| miracl-ja | 49.0 | 61.0 |
| miracl-ko | 50.9 | 64.8 |
| miracl-sw | 61.4 | 74.5 |
| miracl-te | 67.8 | 72.3 |
| miracl-th | 60.2 | 71.9 |
| miracl-yo | 56.4 | 62.2 |
| miracl-zh | 43.8 | 56.5 |
| **Avg** | 54.3 | 64.6 |
|
peldrak/riviera | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: pixel_values
dtype: image
splits:
- name: train
num_bytes: 57352928.0
num_examples: 382
download_size: 57347880
dataset_size: 57352928.0
---
# Dataset Card for "riviera"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adxtya/mplug_test | ---
license: mit
---
|
Sheokedech/id_instructions-id-small | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 0
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "id_instructions-id-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nataliaElv/textclass_descriptives_vectors | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for textclass_descriptives_vectors
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("nataliaElv/textclass_descriptives_vectors")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("nataliaElv/textclass_descriptives_vectors")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| prompt | Prompt | text | True | True |
| context | Context | text | False | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| class | Classify the instruction according to its class | label_selection | True | N/A | ['closed_qa', 'classification', 'open_qa', 'information_extraction', 'brainstorming', 'general_qa', 'summarization', 'creative_writing'] |
| response | Response | text | True | N/A | N/A |
The **suggestions** are human- or machine-generated recommendations for each question to assist the annotator during the annotation process. They are always linked to an existing question and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above, with the column names suffixed accordingly.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, or to capture details about the record itself, such as a link to the original source, the author, or the date. The metadata is always optional, and can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
**✨ NEW** The **vectors** are additional columns that each contain a floating-point vector, constrained to the dimensions pre-defined in the **vectors_settings** when configuring the vectors within the dataset itself; the vectors are always 1-dimensional arrays. The **vectors** are optional and identified by the pre-defined vector name in the dataset configuration file in `argilla.yaml` (see the configuration sketch after the tables below).
| Vector Name | Title | Dimensions |
|-------------|-------|------------|
| prompt-similarity | prompt-similarity | [1, 768] |
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
| n_characters | No. of characters | integer | None - None | True |
| passed_quality_check | Passed quality check? | terms | - | True |
| flesch_reading_ease | Reading ease | float | None - None | True |
| entropy | Entropy | float | None - None | True |
The **guidelines** are optional as well; they are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
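As referenced above, the following is a minimal sketch (assuming Argilla >= 1.19) of how a dataset with this schema, including the `prompt-similarity` vector settings, could be configured and how a record carrying a vector could be built; the dummy vector values are illustrative only:
```python
import argilla as rg

# Sketch of a FeedbackDataset matching the schema described above
dataset = rg.FeedbackDataset(
    fields=[
        rg.TextField(name="prompt", title="Prompt", use_markdown=True),
        rg.TextField(name="context", title="Context", required=False, use_markdown=True),
    ],
    questions=[
        rg.LabelQuestion(
            name="class",
            title="Classify the instruction according to its class",
            labels=["closed_qa", "classification", "open_qa", "information_extraction",
                    "brainstorming", "general_qa", "summarization", "creative_writing"],
        ),
        rg.TextQuestion(name="response", title="Response"),
    ],
    vectors_settings=[
        # Each record may carry a 768-dimensional "prompt-similarity" vector
        rg.VectorSettings(name="prompt-similarity", dimensions=768),
    ],
)

# A record with an (illustrative) vector attached
record = rg.FeedbackRecord(
    fields={"prompt": "Can brain cells move?"},
    vectors={"prompt-similarity": [0.0] * 768},  # dummy values, one float per dimension
)
dataset.add_records([record])
```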
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"prompt": "Can brain cells move? By movement I mean long distance migration (preferably within the brain only)."
},
"metadata": {
"entropy": 0.4352176404374839,
"flesch_reading_ease": 82.39000000000001,
"n_characters": 85,
"passed_quality_check": "True"
},
"responses": [],
"suggestions": [
{
"agent": "setfit",
"question_name": "class",
"score": 0.2823514938354492,
"type": null,
"value": "brainstorming"
},
{
"agent": null,
"question_name": "response",
"score": null,
"type": null,
"value": "The question is relatively broad and one should take into account that the brain not only consists of neurons, but also glial cells (supportive cells) and pre-mitotic neuronal stem cells. Furthermore, as critical fellow-scientists have indicated, developmental stage is very important, as the developing embryonic brain is very different from the adult brain.\nHowever, after sifting through various publications, the answer to the question is actually remarkably simple: Yes, brain cells migrate.\nIn the adult brain glial cells migrate in the brain (Kl\u00e4mbt, 2009). Glial cells are involved in a myriad of functions, but a notable example of migrating glial cells are the oligodendrocytes that migrate relative long distances to find their target axons onto which they wrap themselves to form the insulating myelin sheath (Tsai and Miller, 2002).\nNeuronal stem cells migrate over long distances in response to injury (Imitola et al., 2004) and they migrate from specific stem-cell locations (e.g., hippocampus and subventricular zone) to other regions (Clarke, 2003).\nPost-mitotic, but non-differentiated neurons have been shown to migrate in the adult brain in fish (Scott et al., 2012), and in mammals and non-human primates as well (Sawada et al., 2011).\nNot surprisingly, glial cells, stem cells and neurons also migrate during embryonic development. Most notably, post-mitotic neurons destined to fulfill peripheral functions have to migrate over relatively long distances from the neural crest to their target locations (Neuroscience, 2nd ed, Neuronal Migration)."
}
],
"vectors": {
"prompt-similarity": [
-0.013013245537877083,
0.01881960965692997,
0.018717532977461815,
-0.014981311745941639,
0.03672853484749794,
-0.015297300182282925,
0.031154541298747063,
0.009528533555567265,
-0.031607501208782196,
-0.039829764515161514,
-0.019534926861524582,
-0.019294919446110725,
-0.047140125185251236,
0.03812485188245773,
-0.018894944339990616,
0.039123568683862686,
0.03436238318681717,
-0.007996739819645882,
0.013651853427290916,
-0.016834214329719543,
-0.02929615043103695,
0.002512674080207944,
0.008257705718278885,
0.03932825103402138,
0.031019780784845352,
-0.028575727716088295,
-0.022710563614964485,
0.0132012739777565,
-0.048433348536491394,
-0.02651829645037651,
0.01601981930434704,
-0.006484998855739832,
-0.07150214165449142,
-0.010764969512820244,
0.00407565338537097,
-0.007564086001366377,
-0.015640858560800552,
-0.012789258733391762,
0.00717244204133749,
-0.051655009388923645,
-0.030335327610373497,
0.007193537428975105,
-0.020686019212007523,
0.016904372721910477,
-0.057382386177778244,
0.020192697644233704,
-0.0621950700879097,
0.0034242896363139153,
-0.04375811666250229,
-0.012516515329480171,
-0.04787379130721092,
0.05757446959614754,
0.045590516179800034,
-0.019442711025476456,
0.02614322304725647,
0.022066324949264526,
-0.017174094915390015,
-0.03904383257031441,
-0.014966102316975594,
-0.04261021316051483,
0.06123539060354233,
0.01483749970793724,
-0.009737796150147915,
-0.021765291690826416,
-0.001423536567017436,
-0.04854138195514679,
0.03245295211672783,
0.02051699534058571,
-0.05414895340800285,
-0.03563692420721054,
-0.0506395623087883,
-0.06071240082383156,
-0.017511913552880287,
0.006278000771999359,
0.009547360241413116,
-0.05603624880313873,
-0.0038324843626469374,
0.012652688659727573,
0.06399084627628326,
0.01680467091500759,
0.030588308349251747,
0.023556867614388466,
-0.04122614115476608,
0.06281794607639313,
0.002343484666198492,
-0.03668874129652977,
-0.01711929589509964,
-4.190538675175048e-06,
-0.05742541700601578,
0.04727115109562874,
-0.04583971947431564,
-0.01956474594771862,
0.02877974882721901,
0.05513108894228935,
0.015185099095106125,
-0.006118557415902615,
0.0272984616458416,
-0.02677239291369915,
-0.009623365476727486,
0.05534995347261429,
-0.02598058618605137,
-0.04715755954384804,
-0.022215673699975014,
-0.009219354949891567,
-0.05435849353671074,
-0.03680011257529259,
-0.008128424175083637,
-0.029657825827598572,
0.022026637569069862,
-0.012166539207100868,
-0.025011586025357246,
-0.02193683199584484,
-0.00693196477368474,
0.006336281541734934,
-0.043086495250463486,
0.05915242061018944,
0.02211538702249527,
-0.023119445890188217,
0.007697188761085272,
-0.0552712008357048,
0.03299417346715927,
0.05157257989048958,
-0.03600669652223587,
0.044204846024513245,
0.025432858616113663,
0.007447212003171444,
0.006279517896473408,
0.03376108407974243,
-0.040294621139764786,
-0.058066226541996,
0.012761987745761871,
0.04904710873961449,
-0.012213962152600288,
-0.013692168518900871,
0.027355555444955826,
-0.0023957074154168367,
0.028188826516270638,
-0.027611739933490753,
0.029400011524558067,
0.0013150176964700222,
0.0362129732966423,
0.012163455598056316,
0.03474310413002968,
-0.007054436486214399,
0.02536170184612274,
-0.07868500053882599,
-0.04395574703812599,
-0.04243417829275131,
0.002584034577012062,
-0.0005564193706959486,
-0.019545502960681915,
0.05276765301823616,
0.0394630953669548,
-0.057229649275541306,
-0.01710808463394642,
0.05301479622721672,
-0.03010011836886406,
0.03373352438211441,
-0.04287588968873024,
-0.006589761935174465,
0.02951083518564701,
-0.019792240113019943,
0.012560124509036541,
-0.022978615015745163,
-0.01804402843117714,
-0.01765276864171028,
0.050604935735464096,
-0.031133880838751793,
-0.03520930930972099,
0.06622219830751419,
-0.04686705023050308,
0.01252678595483303,
0.06677322834730148,
0.0012780202087014914,
-0.007755340542644262,
-0.002916350495070219,
0.062082815915346146,
-0.003067526500672102,
0.006080616265535355,
-0.036430295556783676,
-0.06199180707335472,
0.02642948180437088,
-0.00425749970600009,
0.025306515395641327,
-0.0014685469213873148,
-0.028660226613283157,
0.052989762276411057,
-0.01557255256921053,
0.009855816140770912,
-0.0121422428637743,
-0.03747929632663727,
-0.08137062191963196,
0.007190469186753035,
0.011331912130117416,
0.06765188276767731,
-0.022611519321799278,
-0.02787146158516407,
0.05748944729566574,
0.00487024150788784,
0.039478056132793427,
0.01931411400437355,
0.013803835026919842,
0.04888024553656578,
-0.037333935499191284,
-0.027693377807736397,
0.059805672615766525,
0.03614082559943199,
0.005785312503576279,
0.013619908131659031,
0.05161786451935768,
-0.00884980708360672,
0.010016173124313354,
0.042678751051425934,
-0.027733702212572098,
0.027968743816018105,
-0.037427231669425964,
-0.002935838419944048,
-0.01202351227402687,
0.006725606042891741,
-0.07508431375026703,
-0.0060306512750685215,
0.008263292722404003,
-0.025336965918540955,
0.04014277085661888,
0.008093785494565964,
0.08171582221984863,
0.07616759836673737,
-0.0771564468741417,
0.022446291521191597,
0.008821032010018826,
0.013829128816723824,
0.02364560402929783,
-0.0022572220768779516,
0.03746487572789192,
-0.005879886448383331,
0.008362085558474064,
-0.013305987231433392,
-0.06773458421230316,
0.047247979789972305,
-0.054940834641456604,
0.006651178002357483,
0.04406357184052467,
0.0032514971680939198,
0.06607890874147415,
-0.023339349776506424,
-0.015506909228861332,
0.056580446660518646,
-0.013175010681152344,
-0.009680991992354393,
0.003048372222110629,
-0.02173807844519615,
-0.03575072064995766,
0.0034152292646467686,
0.0023930943571031094,
0.032616451382637024,
-0.08494752645492554,
-0.04464119300246239,
-0.008594084531068802,
0.07189679890871048,
0.039310749620199203,
-0.0032280997838824987,
0.0571722686290741,
0.031821854412555695,
-0.018074551597237587,
-0.05658836290240288,
-0.10419323295354843,
-0.038979772478342056,
-0.004710170906037092,
0.06021471694111824,
0.02279377542436123,
0.06624987721443176,
-0.0021200855262577534,
0.02761155366897583,
9.02639476407785e-06,
-0.021869199350476265,
0.024204667657613754,
0.06580100208520889,
0.002844455884769559,
-0.01991298981010914,
-0.0200088731944561,
0.02950236387550831,
0.06952787935733795,
-0.017109204083681107,
-0.029190661385655403,
0.022067055106163025,
-0.05215190351009369,
-0.002498551970347762,
-0.003893302520737052,
-0.004048035945743322,
0.044902484863996506,
0.01182111818343401,
0.014091513119637966,
0.007183252368122339,
0.035346873104572296,
-0.005363106727600098,
0.05331592261791229,
0.04623641446232796,
-0.01476075779646635,
-0.010740607045590878,
-0.019701674580574036,
0.00595542136579752,
0.03692961856722832,
0.012378417886793613,
-0.022257760167121887,
0.003160405671223998,
-1.8131876231564092e-06,
-0.017647042870521545,
-0.03700786456465721,
-0.24109095335006714,
0.006522865034639835,
-0.0008469457970932126,
-0.03644183278083801,
0.017320087179541588,
0.01328502781689167,
0.003192389849573374,
-0.028336772695183754,
-0.03504892438650131,
-0.0014239358715713024,
-0.03514610975980759,
0.022008158266544342,
-0.011342125944793224,
0.05192045867443085,
0.03085877001285553,
-0.025241609662771225,
0.0237770676612854,
-0.05109399929642677,
-0.010781534016132355,
0.0020606154575943947,
-0.04335577413439751,
-0.028212837874889374,
0.0002747350081335753,
0.046457286924123764,
0.010325346142053604,
0.08826259523630142,
-0.043199118226766586,
-0.010338421911001205,
-0.06027568131685257,
0.009151126258075237,
-0.01782579906284809,
-0.027093859389424324,
0.007199855055660009,
-0.019019782543182373,
0.022030359134078026,
-0.010693224146962166,
0.0009507028153166175,
-0.026087958365678787,
0.024485325440764427,
-0.04338093847036362,
-0.04680050536990166,
-0.03561573103070259,
-0.02055582031607628,
0.0038633362855762243,
0.06559355556964874,
-0.023061249405145645,
-0.017895730212330818,
0.0038954829797148705,
0.008263446390628815,
0.04940579831600189,
-0.008470145985484123,
-0.0014497878728434443,
-0.0061887046322226524,
0.03428115323185921,
-0.0007602313999086618,
-0.009981812909245491,
0.027376258745789528,
0.026810050010681152,
-0.03568948805332184,
-0.0058975000865757465,
0.02460271678864956,
-0.01275318767875433,
-0.03641323372721672,
-0.044666923582553864,
0.029698815196752548,
-0.03262021392583847,
-0.02356722205877304,
-0.04117002710700035,
0.0848817452788353,
-0.004286558832973242,
-0.018582580611109734,
0.013618958182632923,
-0.03509534150362015,
-0.06519659608602524,
0.028257008641958237,
0.021286210045218468,
-0.06835642457008362,
-0.054849766194820404,
-0.01941634714603424,
0.035323113203048706,
-0.025973310694098473,
0.002146123442798853,
0.026771889999508858,
0.05470979958772659,
-0.03781023249030113,
-0.04531051591038704,
0.012180115096271038,
0.0009777187369763851,
-0.0416688397526741,
-0.013594291172921658,
0.09633821249008179,
0.00042126362677663565,
0.02082621492445469,
-0.011436634697020054,
0.052587978541851044,
0.04485282301902771,
-0.011207791976630688,
-0.028182996436953545,
0.028562700375914574,
-0.0452943854033947,
0.06573814153671265,
-0.04766593873500824,
0.029138406738638878,
-0.014932483434677124,
0.012515360489487648,
-0.008935957215726376,
-0.05353805422782898,
0.026841312646865845,
0.03796624764800072,
0.012656201608479023,
0.03330421447753906,
0.011739440262317657,
0.030942635610699654,
-0.04102332144975662,
0.015347322449088097,
-0.05560077726840973,
0.008390153758227825,
0.07054135203361511,
0.028721380978822708,
0.0028039051685482264,
-0.020784109830856323,
0.009438532404601574,
-0.0605308897793293,
-0.01866653747856617,
-0.06967351585626602,
0.03392767161130905,
0.006826978642493486,
0.025683172047138214,
-0.0034906533546745777,
0.029044777154922485,
-0.015162697061896324,
0.0038685882464051247,
0.0499376617372036,
0.02318284660577774,
0.010678326711058617,
-0.014715512283146381,
-0.042784977704286575,
-0.002209000289440155,
-0.014008396305143833,
-0.028120383620262146,
0.0026574472431093454,
0.030087493360042572,
0.03461616113781929,
0.03625616058707237,
-0.011008461937308311,
0.043217092752456665,
-0.045464660972356796,
0.022507434710860252,
-0.02420778200030327,
-0.002824041061103344,
0.028755616396665573,
-0.04187369719147682,
-0.015139559283852577,
-0.053725019097328186,
-0.025201475247740746,
-0.012609651312232018,
0.04252387210726738,
0.02392260916531086,
0.016753822565078735,
-0.03215314820408821,
-0.01936139352619648,
-0.046136122196912766,
-0.005073823034763336,
0.008640735410153866,
-0.009679833427071571,
0.07807573676109314,
-0.012567133642733097,
-0.031146127730607986,
-0.026593416929244995,
0.026098934933543205,
0.024264968931674957,
-0.0075249760411679745,
-0.06842546164989471,
0.03510553762316704,
-0.006868013646453619,
0.01947402022778988,
-0.029724987223744392,
-0.03539305925369263,
0.028799021616578102,
0.030593188479542732,
0.03373757004737854,
-0.028323186561465263,
-0.005245779640972614,
0.0025080086197704077,
0.06109020859003067,
-0.0414900928735733,
0.05396903306245804,
-0.047728512436151505,
-0.017351394519209862,
0.02362070232629776,
-0.007311966270208359,
0.028682058677077293,
-0.014722640626132488,
-0.007481182459741831,
-0.035072099417448044,
-0.021136067807674408,
0.019015248864889145,
0.008854486048221588,
-0.0005861225072294474,
-0.012599045410752296,
0.0175931416451931,
-0.04479547217488289,
-0.008386379107832909,
0.03618542104959488,
0.01628889888525009,
-0.08031677454710007,
0.039770182222127914,
0.041299525648355484,
-0.008586069568991661,
0.038849104195833206,
-0.019013259559869766,
0.015810709446668625,
-0.026148298755288124,
0.03409867733716965,
0.012881561182439327,
0.0007065649842843413,
-0.010571092367172241,
-0.04538531228899956,
-0.005888957995921373,
0.010284706018865108,
-0.00910396408289671,
0.0024551369715481997,
-0.028111808001995087,
-0.056267447769641876,
-0.03570198640227318,
0.0007470435812138021,
-0.03200932964682579,
3.1971394491847605e-05,
0.07073836773633957,
-0.025731729343533516,
0.016087668016552925,
-0.019969554618000984,
-0.02380352094769478,
0.07783369719982147,
-0.0077037508599460125,
-0.026075275614857674,
0.03502178564667702,
-0.005804023705422878,
-0.015163084492087364,
0.06934002041816711,
0.0368470698595047,
0.017380570992827415,
-0.03955657035112381,
-0.028987567871809006,
0.027637561783194542,
0.04501322656869888,
-0.026961492374539375,
0.00020521112310234457,
-0.0452781617641449,
0.049811046570539474,
0.028363030403852463,
0.004181100055575371,
0.0021030332427471876,
-0.015064270235598087,
0.05535869300365448,
-0.029472526162862778,
-0.04478950425982475,
0.0027753578033298254,
-0.004514075815677643,
-0.023607026785612106,
0.023749861866235733,
0.01957106776535511,
-0.024119185283780098,
-0.01694166287779808,
0.04224187880754471,
0.017501620575785637,
-0.004305294249206781,
0.018400326371192932,
0.044329140335321426,
-0.06549150496721268,
0.008912339806556702,
-0.03948299214243889,
-0.03004170022904873,
0.0032710819505155087,
-0.019911974668502808,
0.02723447047173977,
-0.022703979164361954,
0.034845732152462006,
0.05078149959445,
-0.06074056029319763,
-0.01075307372957468,
0.07076920568943024,
0.0021933179814368486,
-0.03962651267647743,
0.024789808318018913,
-0.07408491522073746,
0.0247175469994545,
-0.03231014311313629,
-0.02483881451189518,
0.002730102278292179,
0.037088677287101746,
-0.0033236793242394924,
0.005284950602799654,
0.014846455305814743,
0.03255154564976692,
0.02706083469092846,
0.049154844135046005,
0.06594257056713104,
-0.02415977232158184,
0.026963576674461365,
-0.07380963861942291,
0.06781016290187836,
0.018511293455958366,
-0.015869174152612686,
-0.038478851318359375,
0.0335836261510849,
0.02612367272377014,
-0.06550119817256927,
0.01825067587196827,
0.013035713694989681,
-0.008435440249741077,
-0.08638200908899307,
0.05963002145290375,
0.024324510246515274,
-0.02895611710846424,
-0.04167400300502777,
0.04319422319531441,
-0.05413385480642319,
0.015215273015201092,
0.03725837171077728,
-0.004908927250653505,
-0.002934563672170043,
0.041528936475515366,
0.012155082076787949,
0.04147651046514511,
0.05855671316385269,
-0.0299361739307642,
0.02512580342590809,
0.020929407328367233,
0.06349261105060577,
0.053939227014780045,
0.05713503807783127,
-0.0038927458226680756,
0.07881465554237366,
-0.012467852793633938,
-0.034171897917985916,
0.020261041820049286,
-0.0021278418134897947,
-0.002377619966864586,
0.004330282565206289,
0.012825283221900463,
0.04088682681322098,
0.008562165312469006,
0.0359053835272789,
-0.053358469158411026,
0.011921711266040802,
0.020781131461262703,
0.036604978144168854,
0.03237057104706764,
0.027678076177835464,
0.025395873934030533,
0.024215875193476677,
-0.02316826581954956,
-0.049021363258361816,
-0.005335877649486065,
-0.04324529692530632,
0.033709343522787094,
0.009520786814391613,
-0.06291788816452026,
0.016032546758651733,
-0.017273124307394028,
0.03564963862299919,
0.06645374745130539,
0.0019759878050535917,
0.04844486713409424,
-0.033923204988241196,
0.03365401178598404,
-0.03546270355582237,
0.017526622861623764,
0.05221246927976608,
0.027283355593681335,
0.00947093591094017,
-0.027012217789888382,
-0.001877183560281992,
0.016856137663125992,
0.013093618676066399,
0.025977004319429398,
-0.06342248618602753,
-0.002382427453994751,
0.02860536240041256,
0.05974981561303139,
-0.03283765912055969,
-0.04812508821487427,
-0.05995623767375946,
-0.037662360817193985,
-0.035185620188713074,
-0.01508689671754837,
0.035811878740787506,
-0.052011068910360336,
-0.059904687106609344,
-0.026118896901607513,
-0.010637863539159298,
-0.011021668091416359,
-0.03290007635951042,
-0.030089853331446648,
-0.03142952546477318,
0.04359989985823631,
0.040401678532361984,
0.02362644672393799,
0.013705096207559109,
0.08372753113508224,
-0.029495922848582268,
-0.06889309734106064,
0.00678789708763361,
-0.007068346720188856,
0.07379143685102463,
-0.02387312427163124,
-0.0024106407072395086,
-0.08333039283752441,
0.018529068678617477,
0.03415510058403015,
0.022234655916690826,
-0.10251957923173904,
0.036007318645715714,
-0.00660698814317584,
0.00572143355384469,
0.026509005576372147,
-0.011688550002872944,
-0.008342253975570202,
-0.04845166578888893,
-0.030434146523475647,
0.0014085661387071013,
-0.03824504837393761,
0.06172807887196541,
-0.03449011966586113,
0.07329946011304855,
0.029795274138450623,
0.026717940345406532,
-0.045109957456588745,
0.024327795952558517,
-0.008753367699682713,
0.01352944690734148,
-0.023602385073900223,
-0.036179229617118835,
-0.008612464182078838,
-0.12454637885093689,
-0.016345543786883354,
-0.012179647572338581,
-0.02734498679637909,
-0.05160606652498245,
0.019233766943216324,
-0.027092240750789642,
0.016395756974816322,
-0.012205400504171848,
-0.014156125485897064,
-0.04153557866811752,
-0.020725106820464134,
-0.03977225720882416,
-0.05970294773578644,
-0.0023274689447134733,
-0.0164078027009964,
-0.021304765716195107,
0.053715966641902924,
-0.017753545194864273,
0.010519351810216904,
0.004593766760081053,
-0.03116416372358799,
-0.027580147609114647,
0.0033015876542776823,
0.033720631152391434
]
}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"class": [],
"class-suggestion": "brainstorming",
"class-suggestion-metadata": {
"agent": "setfit",
"score": 0.2823514938354492,
"type": null
},
"context": null,
"external_id": null,
"metadata": "{\"n_characters\": 85, \"passed_quality_check\": \"True\", \"flesch_reading_ease\": 82.39000000000001, \"entropy\": 0.4352176404374839}",
"prompt": "Can brain cells move? By movement I mean long distance migration (preferably within the brain only).",
"response": [],
"response-suggestion": "The question is relatively broad and one should take into account that the brain not only consists of neurons, but also glial cells (supportive cells) and pre-mitotic neuronal stem cells. Furthermore, as critical fellow-scientists have indicated, developmental stage is very important, as the developing embryonic brain is very different from the adult brain.\nHowever, after sifting through various publications, the answer to the question is actually remarkably simple: Yes, brain cells migrate.\nIn the adult brain glial cells migrate in the brain (Kl\u00e4mbt, 2009). Glial cells are involved in a myriad of functions, but a notable example of migrating glial cells are the oligodendrocytes that migrate relative long distances to find their target axons onto which they wrap themselves to form the insulating myelin sheath (Tsai and Miller, 2002).\nNeuronal stem cells migrate over long distances in response to injury (Imitola et al., 2004) and they migrate from specific stem-cell locations (e.g., hippocampus and subventricular zone) to other regions (Clarke, 2003).\nPost-mitotic, but non-differentiated neurons have been shown to migrate in the adult brain in fish (Scott et al., 2012), and in mammals and non-human primates as well (Sawada et al., 2011).\nNot surprisingly, glial cells, stem cells and neurons also migrate during embryonic development. Most notably, post-mitotic neurons destined to fulfill peripheral functions have to migrate over relatively long distances from the neural crest to their target locations (Neuroscience, 2nd ed, Neuronal Migration).",
"response-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"vectors": {
"prompt-similarity": [
-0.013013245537877083,
0.01881960965692997,
0.018717532977461815,
-0.014981311745941639,
0.03672853484749794,
-0.015297300182282925,
0.031154541298747063,
0.009528533555567265,
-0.031607501208782196,
-0.039829764515161514,
-0.019534926861524582,
-0.019294919446110725,
-0.047140125185251236,
0.03812485188245773,
-0.018894944339990616,
0.039123568683862686,
0.03436238318681717,
-0.007996739819645882,
0.013651853427290916,
-0.016834214329719543,
-0.02929615043103695,
0.002512674080207944,
0.008257705718278885,
0.03932825103402138,
0.031019780784845352,
-0.028575727716088295,
-0.022710563614964485,
0.0132012739777565,
-0.048433348536491394,
-0.02651829645037651,
0.01601981930434704,
-0.006484998855739832,
-0.07150214165449142,
-0.010764969512820244,
0.00407565338537097,
-0.007564086001366377,
-0.015640858560800552,
-0.012789258733391762,
0.00717244204133749,
-0.051655009388923645,
-0.030335327610373497,
0.007193537428975105,
-0.020686019212007523,
0.016904372721910477,
-0.057382386177778244,
0.020192697644233704,
-0.0621950700879097,
0.0034242896363139153,
-0.04375811666250229,
-0.012516515329480171,
-0.04787379130721092,
0.05757446959614754,
0.045590516179800034,
-0.019442711025476456,
0.02614322304725647,
0.022066324949264526,
-0.017174094915390015,
-0.03904383257031441,
-0.014966102316975594,
-0.04261021316051483,
0.06123539060354233,
0.01483749970793724,
-0.009737796150147915,
-0.021765291690826416,
-0.001423536567017436,
-0.04854138195514679,
0.03245295211672783,
0.02051699534058571,
-0.05414895340800285,
-0.03563692420721054,
-0.0506395623087883,
-0.06071240082383156,
-0.017511913552880287,
0.006278000771999359,
0.009547360241413116,
-0.05603624880313873,
-0.0038324843626469374,
0.012652688659727573,
0.06399084627628326,
0.01680467091500759,
0.030588308349251747,
0.023556867614388466,
-0.04122614115476608,
0.06281794607639313,
0.002343484666198492,
-0.03668874129652977,
-0.01711929589509964,
-4.190538675175048e-06,
-0.05742541700601578,
0.04727115109562874,
-0.04583971947431564,
-0.01956474594771862,
0.02877974882721901,
0.05513108894228935,
0.015185099095106125,
-0.006118557415902615,
0.0272984616458416,
-0.02677239291369915,
-0.009623365476727486,
0.05534995347261429,
-0.02598058618605137,
-0.04715755954384804,
-0.022215673699975014,
-0.009219354949891567,
-0.05435849353671074,
-0.03680011257529259,
-0.008128424175083637,
-0.029657825827598572,
0.022026637569069862,
-0.012166539207100868,
-0.025011586025357246,
-0.02193683199584484,
-0.00693196477368474,
0.006336281541734934,
-0.043086495250463486,
0.05915242061018944,
0.02211538702249527,
-0.023119445890188217,
0.007697188761085272,
-0.0552712008357048,
0.03299417346715927,
0.05157257989048958,
-0.03600669652223587,
0.044204846024513245,
0.025432858616113663,
0.007447212003171444,
0.006279517896473408,
0.03376108407974243,
-0.040294621139764786,
-0.058066226541996,
0.012761987745761871,
0.04904710873961449,
-0.012213962152600288,
-0.013692168518900871,
0.027355555444955826,
-0.0023957074154168367,
0.028188826516270638,
-0.027611739933490753,
0.029400011524558067,
0.0013150176964700222,
0.0362129732966423,
0.012163455598056316,
0.03474310413002968,
-0.007054436486214399,
0.02536170184612274,
-0.07868500053882599,
-0.04395574703812599,
-0.04243417829275131,
0.002584034577012062,
-0.0005564193706959486,
-0.019545502960681915,
0.05276765301823616,
0.0394630953669548,
-0.057229649275541306,
-0.01710808463394642,
0.05301479622721672,
-0.03010011836886406,
0.03373352438211441,
-0.04287588968873024,
-0.006589761935174465,
0.02951083518564701,
-0.019792240113019943,
0.012560124509036541,
-0.022978615015745163,
-0.01804402843117714,
-0.01765276864171028,
0.050604935735464096,
-0.031133880838751793,
-0.03520930930972099,
0.06622219830751419,
-0.04686705023050308,
0.01252678595483303,
0.06677322834730148,
0.0012780202087014914,
-0.007755340542644262,
-0.002916350495070219,
0.062082815915346146,
-0.003067526500672102,
0.006080616265535355,
-0.036430295556783676,
-0.06199180707335472,
0.02642948180437088,
-0.00425749970600009,
0.025306515395641327,
-0.0014685469213873148,
-0.028660226613283157,
0.052989762276411057,
-0.01557255256921053,
0.009855816140770912,
-0.0121422428637743,
-0.03747929632663727,
-0.08137062191963196,
0.007190469186753035,
0.011331912130117416,
0.06765188276767731,
-0.022611519321799278,
-0.02787146158516407,
0.05748944729566574,
0.00487024150788784,
0.039478056132793427,
0.01931411400437355,
0.013803835026919842,
0.04888024553656578,
-0.037333935499191284,
-0.027693377807736397,
0.059805672615766525,
0.03614082559943199,
0.005785312503576279,
0.013619908131659031,
0.05161786451935768,
-0.00884980708360672,
0.010016173124313354,
0.042678751051425934,
-0.027733702212572098,
0.027968743816018105,
-0.037427231669425964,
-0.002935838419944048,
-0.01202351227402687,
0.006725606042891741,
-0.07508431375026703,
-0.0060306512750685215,
0.008263292722404003,
-0.025336965918540955,
0.04014277085661888,
0.008093785494565964,
0.08171582221984863,
0.07616759836673737,
-0.0771564468741417,
0.022446291521191597,
0.008821032010018826,
0.013829128816723824,
0.02364560402929783,
-0.0022572220768779516,
0.03746487572789192,
-0.005879886448383331,
0.008362085558474064,
-0.013305987231433392,
-0.06773458421230316,
0.047247979789972305,
-0.054940834641456604,
0.006651178002357483,
0.04406357184052467,
0.0032514971680939198,
0.06607890874147415,
-0.023339349776506424,
-0.015506909228861332,
0.056580446660518646,
-0.013175010681152344,
-0.009680991992354393,
0.003048372222110629,
-0.02173807844519615,
-0.03575072064995766,
0.0034152292646467686,
0.0023930943571031094,
0.032616451382637024,
-0.08494752645492554,
-0.04464119300246239,
-0.008594084531068802,
0.07189679890871048,
0.039310749620199203,
-0.0032280997838824987,
0.0571722686290741,
0.031821854412555695,
-0.018074551597237587,
-0.05658836290240288,
-0.10419323295354843,
-0.038979772478342056,
-0.004710170906037092,
0.06021471694111824,
0.02279377542436123,
0.06624987721443176,
-0.0021200855262577534,
0.02761155366897583,
9.02639476407785e-06,
-0.021869199350476265,
0.024204667657613754,
0.06580100208520889,
0.002844455884769559,
-0.01991298981010914,
-0.0200088731944561,
0.02950236387550831,
0.06952787935733795,
-0.017109204083681107,
-0.029190661385655403,
0.022067055106163025,
-0.05215190351009369,
-0.002498551970347762,
-0.003893302520737052,
-0.004048035945743322,
0.044902484863996506,
0.01182111818343401,
0.014091513119637966,
0.007183252368122339,
0.035346873104572296,
-0.005363106727600098,
0.05331592261791229,
0.04623641446232796,
-0.01476075779646635,
-0.010740607045590878,
-0.019701674580574036,
0.00595542136579752,
0.03692961856722832,
0.012378417886793613,
-0.022257760167121887,
0.003160405671223998,
-1.8131876231564092e-06,
-0.017647042870521545,
-0.03700786456465721,
-0.24109095335006714,
0.006522865034639835,
-0.0008469457970932126,
-0.03644183278083801,
0.017320087179541588,
0.01328502781689167,
0.003192389849573374,
-0.028336772695183754,
-0.03504892438650131,
-0.0014239358715713024,
-0.03514610975980759,
0.022008158266544342,
-0.011342125944793224,
0.05192045867443085,
0.03085877001285553,
-0.025241609662771225,
0.0237770676612854,
-0.05109399929642677,
-0.010781534016132355,
0.0020606154575943947,
-0.04335577413439751,
-0.028212837874889374,
0.0002747350081335753,
0.046457286924123764,
0.010325346142053604,
0.08826259523630142,
-0.043199118226766586,
-0.010338421911001205,
-0.06027568131685257,
0.009151126258075237,
-0.01782579906284809,
-0.027093859389424324,
0.007199855055660009,
-0.019019782543182373,
0.022030359134078026,
-0.010693224146962166,
0.0009507028153166175,
-0.026087958365678787,
0.024485325440764427,
-0.04338093847036362,
-0.04680050536990166,
-0.03561573103070259,
-0.02055582031607628,
0.0038633362855762243,
0.06559355556964874,
-0.023061249405145645,
-0.017895730212330818,
0.0038954829797148705,
0.008263446390628815,
0.04940579831600189,
-0.008470145985484123,
-0.0014497878728434443,
-0.0061887046322226524,
0.03428115323185921,
-0.0007602313999086618,
-0.009981812909245491,
0.027376258745789528,
0.026810050010681152,
-0.03568948805332184,
-0.0058975000865757465,
0.02460271678864956,
-0.01275318767875433,
-0.03641323372721672,
-0.044666923582553864,
0.029698815196752548,
-0.03262021392583847,
-0.02356722205877304,
-0.04117002710700035,
0.0848817452788353,
-0.004286558832973242,
-0.018582580611109734,
0.013618958182632923,
-0.03509534150362015,
-0.06519659608602524,
0.028257008641958237,
0.021286210045218468,
-0.06835642457008362,
-0.054849766194820404,
-0.01941634714603424,
0.035323113203048706,
-0.025973310694098473,
0.002146123442798853,
0.026771889999508858,
0.05470979958772659,
-0.03781023249030113,
-0.04531051591038704,
0.012180115096271038,
0.0009777187369763851,
-0.0416688397526741,
-0.013594291172921658,
0.09633821249008179,
0.00042126362677663565,
0.02082621492445469,
-0.011436634697020054,
0.052587978541851044,
0.04485282301902771,
-0.011207791976630688,
-0.028182996436953545,
0.028562700375914574,
-0.0452943854033947,
0.06573814153671265,
-0.04766593873500824,
0.029138406738638878,
-0.014932483434677124,
0.012515360489487648,
-0.008935957215726376,
-0.05353805422782898,
0.026841312646865845,
0.03796624764800072,
0.012656201608479023,
0.03330421447753906,
0.011739440262317657,
0.030942635610699654,
-0.04102332144975662,
0.015347322449088097,
-0.05560077726840973,
0.008390153758227825,
0.07054135203361511,
0.028721380978822708,
0.0028039051685482264,
-0.020784109830856323,
0.009438532404601574,
-0.0605308897793293,
-0.01866653747856617,
-0.06967351585626602,
0.03392767161130905,
0.006826978642493486,
0.025683172047138214,
-0.0034906533546745777,
0.029044777154922485,
-0.015162697061896324,
0.0038685882464051247,
0.0499376617372036,
0.02318284660577774,
0.010678326711058617,
-0.014715512283146381,
-0.042784977704286575,
-0.002209000289440155,
-0.014008396305143833,
-0.028120383620262146,
0.0026574472431093454,
0.030087493360042572,
0.03461616113781929,
0.03625616058707237,
-0.011008461937308311,
0.043217092752456665,
-0.045464660972356796,
0.022507434710860252,
-0.02420778200030327,
-0.002824041061103344,
0.028755616396665573,
-0.04187369719147682,
-0.015139559283852577,
-0.053725019097328186,
-0.025201475247740746,
-0.012609651312232018,
0.04252387210726738,
0.02392260916531086,
0.016753822565078735,
-0.03215314820408821,
-0.01936139352619648,
-0.046136122196912766,
-0.005073823034763336,
0.008640735410153866,
-0.009679833427071571,
0.07807573676109314,
-0.012567133642733097,
-0.031146127730607986,
-0.026593416929244995,
0.026098934933543205,
0.024264968931674957,
-0.0075249760411679745,
-0.06842546164989471,
0.03510553762316704,
-0.006868013646453619,
0.01947402022778988,
-0.029724987223744392,
-0.03539305925369263,
0.028799021616578102,
0.030593188479542732,
0.03373757004737854,
-0.028323186561465263,
-0.005245779640972614,
0.0025080086197704077,
0.06109020859003067,
-0.0414900928735733,
0.05396903306245804,
-0.047728512436151505,
-0.017351394519209862,
0.02362070232629776,
-0.007311966270208359,
0.028682058677077293,
-0.014722640626132488,
-0.007481182459741831,
-0.035072099417448044,
-0.021136067807674408,
0.019015248864889145,
0.008854486048221588,
-0.0005861225072294474,
-0.012599045410752296,
0.0175931416451931,
-0.04479547217488289,
-0.008386379107832909,
0.03618542104959488,
0.01628889888525009,
-0.08031677454710007,
0.039770182222127914,
0.041299525648355484,
-0.008586069568991661,
0.038849104195833206,
-0.019013259559869766,
0.015810709446668625,
-0.026148298755288124,
0.03409867733716965,
0.012881561182439327,
0.0007065649842843413,
-0.010571092367172241,
-0.04538531228899956,
-0.005888957995921373,
0.010284706018865108,
-0.00910396408289671,
0.0024551369715481997,
-0.028111808001995087,
-0.056267447769641876,
-0.03570198640227318,
0.0007470435812138021,
-0.03200932964682579,
3.1971394491847605e-05,
0.07073836773633957,
-0.025731729343533516,
0.016087668016552925,
-0.019969554618000984,
-0.02380352094769478,
0.07783369719982147,
-0.0077037508599460125,
-0.026075275614857674,
0.03502178564667702,
-0.005804023705422878,
-0.015163084492087364,
0.06934002041816711,
0.0368470698595047,
0.017380570992827415,
-0.03955657035112381,
-0.028987567871809006,
0.027637561783194542,
0.04501322656869888,
-0.026961492374539375,
0.00020521112310234457,
-0.0452781617641449,
0.049811046570539474,
0.028363030403852463,
0.004181100055575371,
0.0021030332427471876,
-0.015064270235598087,
0.05535869300365448,
-0.029472526162862778,
-0.04478950425982475,
0.0027753578033298254,
-0.004514075815677643,
-0.023607026785612106,
0.023749861866235733,
0.01957106776535511,
-0.024119185283780098,
-0.01694166287779808,
0.04224187880754471,
0.017501620575785637,
-0.004305294249206781,
0.018400326371192932,
0.044329140335321426,
-0.06549150496721268,
0.008912339806556702,
-0.03948299214243889,
-0.03004170022904873,
0.0032710819505155087,
-0.019911974668502808,
0.02723447047173977,
-0.022703979164361954,
0.034845732152462006,
0.05078149959445,
-0.06074056029319763,
-0.01075307372957468,
0.07076920568943024,
0.0021933179814368486,
-0.03962651267647743,
0.024789808318018913,
-0.07408491522073746,
0.0247175469994545,
-0.03231014311313629,
-0.02483881451189518,
0.002730102278292179,
0.037088677287101746,
-0.0033236793242394924,
0.005284950602799654,
0.014846455305814743,
0.03255154564976692,
0.02706083469092846,
0.049154844135046005,
0.06594257056713104,
-0.02415977232158184,
0.026963576674461365,
-0.07380963861942291,
0.06781016290187836,
0.018511293455958366,
-0.015869174152612686,
-0.038478851318359375,
0.0335836261510849,
0.02612367272377014,
-0.06550119817256927,
0.01825067587196827,
0.013035713694989681,
-0.008435440249741077,
-0.08638200908899307,
0.05963002145290375,
0.024324510246515274,
-0.02895611710846424,
-0.04167400300502777,
0.04319422319531441,
-0.05413385480642319,
0.015215273015201092,
0.03725837171077728,
-0.004908927250653505,
-0.002934563672170043,
0.041528936475515366,
0.012155082076787949,
0.04147651046514511,
0.05855671316385269,
-0.0299361739307642,
0.02512580342590809,
0.020929407328367233,
0.06349261105060577,
0.053939227014780045,
0.05713503807783127,
-0.0038927458226680756,
0.07881465554237366,
-0.012467852793633938,
-0.034171897917985916,
0.020261041820049286,
-0.0021278418134897947,
-0.002377619966864586,
0.004330282565206289,
0.012825283221900463,
0.04088682681322098,
0.008562165312469006,
0.0359053835272789,
-0.053358469158411026,
0.011921711266040802,
0.020781131461262703,
0.036604978144168854,
0.03237057104706764,
0.027678076177835464,
0.025395873934030533,
0.024215875193476677,
-0.02316826581954956,
-0.049021363258361816,
-0.005335877649486065,
-0.04324529692530632,
0.033709343522787094,
0.009520786814391613,
-0.06291788816452026,
0.016032546758651733,
-0.017273124307394028,
0.03564963862299919,
0.06645374745130539,
0.0019759878050535917,
0.04844486713409424,
-0.033923204988241196,
0.03365401178598404,
-0.03546270355582237,
0.017526622861623764,
0.05221246927976608,
0.027283355593681335,
0.00947093591094017,
-0.027012217789888382,
-0.001877183560281992,
0.016856137663125992,
0.013093618676066399,
0.025977004319429398,
-0.06342248618602753,
-0.002382427453994751,
0.02860536240041256,
0.05974981561303139,
-0.03283765912055969,
-0.04812508821487427,
-0.05995623767375946,
-0.037662360817193985,
-0.035185620188713074,
-0.01508689671754837,
0.035811878740787506,
-0.052011068910360336,
-0.059904687106609344,
-0.026118896901607513,
-0.010637863539159298,
-0.011021668091416359,
-0.03290007635951042,
-0.030089853331446648,
-0.03142952546477318,
0.04359989985823631,
0.040401678532361984,
0.02362644672393799,
0.013705096207559109,
0.08372753113508224,
-0.029495922848582268,
-0.06889309734106064,
0.00678789708763361,
-0.007068346720188856,
0.07379143685102463,
-0.02387312427163124,
-0.0024106407072395086,
-0.08333039283752441,
0.018529068678617477,
0.03415510058403015,
0.022234655916690826,
-0.10251957923173904,
0.036007318645715714,
-0.00660698814317584,
0.00572143355384469,
0.026509005576372147,
-0.011688550002872944,
-0.008342253975570202,
-0.04845166578888893,
-0.030434146523475647,
0.0014085661387071013,
-0.03824504837393761,
0.06172807887196541,
-0.03449011966586113,
0.07329946011304855,
0.029795274138450623,
0.026717940345406532,
-0.045109957456588745,
0.024327795952558517,
-0.008753367699682713,
0.01352944690734148,
-0.023602385073900223,
-0.036179229617118835,
-0.008612464182078838,
-0.12454637885093689,
-0.016345543786883354,
-0.012179647572338581,
-0.02734498679637909,
-0.05160606652498245,
0.019233766943216324,
-0.027092240750789642,
0.016395756974816322,
-0.012205400504171848,
-0.014156125485897064,
-0.04153557866811752,
-0.020725106820464134,
-0.03977225720882416,
-0.05970294773578644,
-0.0023274689447134733,
-0.0164078027009964,
-0.021304765716195107,
0.053715966641902924,
-0.017753545194864273,
0.010519351810216904,
0.004593766760081053,
-0.03116416372358799,
-0.027580147609114647,
0.0033015876542776823,
0.033720631152391434
]
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, only text fields are supported. These are the fields that will be used to provide responses to the questions.
* **prompt** is of type `text`.
* (optional) **context** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **class** is of type `label_selection` with the following allowed values ['closed_qa', 'classification', 'open_qa', 'information_extraction', 'brainstorming', 'general_qa', 'summarization', 'creative_writing'].
* **response** is of type `text`.
* **Suggestions:** As of Argilla 1.13.0, suggestions have been included to ease or assist the annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself but also the metadata linked to it, if applicable.
* (optional) **class-suggestion** is of type `label_selection` with the following allowed values ['closed_qa', 'classification', 'open_qa', 'information_extraction', 'brainstorming', 'general_qa', 'summarization', 'creative_writing'].
* (optional) **response-suggestion** is of type `text`.
* **✨ NEW** **Vectors**: As of Argilla 1.19.0, vectors have been included to add support for similarity search, i.e. exploring similar records via vector search powered by the configured search engine. Vectors are always optional and are not visible within the UI; they are uploaded and used internally, and must match the dimensions previously defined in their settings.
* (optional) **prompt-similarity** is of type `float32` and has a dimension of (1, `768`).
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record, which can be useful to give the annotators extra context: for example, a link to the original source of the record, or details such as the author, the date, or the source. The metadata is always optional, and can potentially be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
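As a minimal sketch of how these pieces fit together (the repo id below is a placeholder, and the attribute names assume the Argilla 1.19+ Python client), the dataset can be loaded back as a `FeedbackDataset` and its records inspected:
```python
import argilla as rg

# Placeholder repo id -- replace with this dataset's actual repository.
dataset = rg.FeedbackDataset.from_huggingface("owner/dataset-name")

record = dataset.records[0]
print(record.fields["prompt"])           # the text field(s) described above
for suggestion in record.suggestions:    # optional class/response suggestions
    print(suggestion.question_name, suggestion.value)
print(record.metadata)                   # optional free-form metadata
print(record.vectors.keys())             # e.g. dict_keys(['prompt-similarity'])
```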
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This is a supervised fine-tuning dataset that contains instructions. Please write the response to the instruction in the response field. Take the context into account when writing the response.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TaherMAfini/housing_data | ---
dataset_info:
features:
- name: bedrooms__ratio
dtype: float64
- name: rooms_per_house__ratio
dtype: float64
- name: people_per_house__ratio
dtype: float64
- name: log__total_bedrooms
dtype: float64
- name: log__total_rooms
dtype: float64
- name: log__population
dtype: float64
- name: log__households
dtype: float64
- name: log__median_income
dtype: float64
- name: geo__Cluster 0 similarity
dtype: float64
- name: geo__Cluster 1 similarity
dtype: float64
- name: geo__Cluster 2 similarity
dtype: float64
- name: geo__Cluster 3 similarity
dtype: float64
- name: geo__Cluster 4 similarity
dtype: float64
- name: geo__Cluster 5 similarity
dtype: float64
- name: geo__Cluster 6 similarity
dtype: float64
- name: geo__Cluster 7 similarity
dtype: float64
- name: geo__Cluster 8 similarity
dtype: float64
- name: geo__Cluster 9 similarity
dtype: float64
- name: cat__ocean_proximity_<1H OCEAN
dtype: float64
- name: cat__ocean_proximity_INLAND
dtype: float64
- name: cat__ocean_proximity_ISLAND
dtype: float64
- name: cat__ocean_proximity_NEAR BAY
dtype: float64
- name: cat__ocean_proximity_NEAR OCEAN
dtype: float64
- name: remainder__housing_median_age
dtype: float64
- name: remainder__income_cat
dtype: float64
splits:
- name: train
num_bytes: 3302400
num_examples: 16512
- name: test
num_bytes: 825600
num_examples: 4128
download_size: 3441982
dataset_size: 4128000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
pretty_name: Housing Data
size_categories:
- 10K<n<100K
---
# Dataset Card for Housing_data
<!-- Provide a quick summary of the dataset. -->
This dataset contains information about the housing market in California. The data has been split into train and test sets; missing values have been imputed using the median, and numerical values have been normalized.
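As a minimal loading sketch (assuming the default configuration declared above resolves directly with 🤗 `datasets`):
```python
from datasets import load_dataset

# Loads the preprocessed train/test splits declared in this card's config.
ds = load_dataset("TaherMAfini/housing_data")
print(ds["train"].num_rows, ds["test"].num_rows)  # 16512 / 4128 per the card
print(ds["train"].column_names[:5])
```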
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [Housing](https://github.com/ageron/data/raw/main/housing.tgz) |
DeepFoldProtein/SCOP-1.65-New-Clu80 | ---
dataset_info:
features:
- name: pdb_id_chain
dtype: string
- name: domain_ids
dtype: string
- name: domain_boundaries
dtype: string
- name: ndom
dtype: int64
- name: is_dis
dtype: int64
- name: seq
dtype: string
splits:
- name: train
num_bytes: 1740881.5977725184
num_examples: 5752
download_size: 1513543
dataset_size: 1740881.5977725184
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zhen-di/test-2 | ---
license: mit
---
|
tyzhu/random_letter_same_length_find_passage_train100_eval40_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 74307
num_examples: 240
- name: validation
num_bytes: 15541
num_examples: 40
download_size: 50591
dataset_size: 89848
---
# Dataset Card for "random_letter_same_length_find_passage_train100_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chathuranga-jayanath/context-5-finmath-times4j-html-mavendoxia-wro4j-guava-supercsv-len-10000-prompt-0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: filepath
dtype: string
- name: start_bug_line
dtype: int64
- name: end_bug_line
dtype: int64
- name: bug
dtype: string
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 24095868
num_examples: 37517
- name: validation
num_bytes: 3030934
num_examples: 4689
- name: test
num_bytes: 3043669
num_examples: 4689
download_size: 12245462
dataset_size: 30170471
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
tyzhu/ds1_100_try_lora_merge | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10442.47619047619
num_examples: 100
- name: validation
num_bytes: 10442.47619047619
num_examples: 100
download_size: 15570
dataset_size: 20884.95238095238
---
# Dataset Card for "ds1_100_try_lora_merge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UCL-DARK/openai-tldr-summarisation-preferences | ---
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- crowdsourced
- expert-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: summarisation feedback
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- alignment
- text-classification
- summarisation
- human-feedback
task_categories:
- text-classification
task_ids: []
---
# Human feedback data
This is the version of the dataset used in https://arxiv.org/abs/2310.06452.
If starting a new project, we would recommend using https://huggingface.co/datasets/openai/summarize_from_feedback instead.
See https://github.com/openai/summarize-from-feedback for the original details of the dataset.
Here, the data is formatted so that Hugging Face transformers sequence-classification models can be trained as reward functions, as sketched below.
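A minimal sketch of that intended use (the base model `gpt2` is purely illustrative, and the real splits and column names should be inspected first):
```python
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

ds = load_dataset("UCL-DARK/openai-tldr-summarisation-preferences")
print(ds)  # inspect the available splits and column names first

# A single-logit sequence-classification head is a common reward-model setup;
# "gpt2" is just an illustrative base model, not the one used in the paper.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
model = AutoModelForSequenceClassification.from_pretrained("gpt2", num_labels=1)
model.config.pad_token_id = tokenizer.pad_token_id
```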
|
vidhikatkoria/SGD_Calendar | ---
dataset_info:
features:
- name: domain
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: act
dtype: int64
- name: speaker
dtype: int64
splits:
- name: train
num_bytes: 647408.8420239475
num_examples: 2588
- name: test
num_bytes: 352
num_examples: 1
download_size: 235037
dataset_size: 647760.8420239475
---
# Dataset Card for "SGD_Calendar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kurihara_nene_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kurihara_nene/栗原ネネ (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kurihara_nene/栗原ネネ (THE iDOLM@STER: Cinderella Girls), containing 47 images and their tags.
The core tags of this character are `long_hair, black_hair, blue_eyes, breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 47 | 33.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurihara_nene_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 47 | 25.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurihara_nene_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 82 | 42.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurihara_nene_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 47 | 31.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurihara_nene_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 82 | 52.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurihara_nene_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kurihara_nene_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, smile, solo, black_eyes, blush, necklace, bare_shoulders, closed_mouth, looking_at_viewer, bracelet, earrings, frills, hair_bow, hair_flower, open_mouth, sleeveless_dress, upper_body |
| 1 | 12 |  |  |  |  |  | smile, blush, 1girl, navel, open_mouth, solo, braid, midriff, multiple_girls, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | black_eyes | blush | necklace | bare_shoulders | closed_mouth | looking_at_viewer | bracelet | earrings | frills | hair_bow | hair_flower | open_mouth | sleeveless_dress | upper_body | navel | braid | midriff | multiple_girls | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:--------|:-----------|:-----------------|:---------------|:--------------------|:-----------|:-----------|:---------|:-----------|:--------------|:-------------|:-------------------|:-------------|:--------|:--------|:----------|:-----------------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | X | | | X | X | X | X | X |
|
javatask/eidas | ---
license: cc-by-sa-4.0
task_categories:
- conversational
- text-generation
language:
- en
tags:
- eidas
- legal
- code
pretty_name: electronic IDentification, Authentication and trust Services
size_categories:
- n<1K
---
# eIDAS Terminology Dataset
## Dataset Description
### Overview
The eIDAS Terminology dataset is a comprehensive collection of terms and abbreviations related to electronic identification and trust services for electronic transactions in the European Single Market (eIDAS). This dataset provides clear definitions and explanations of various terms, making it an essential resource for researchers and practitioners in digital identity and security.
### Languages
The primary language of the dataset is English.
## Dataset Structure
### Data Instances
A typical data point in the dataset contains the following structure:
- `idx`: An index number for the entry.
- `prompt`: The term or abbreviation to be defined.
- `completion`: The definition or explanation of the term.
- `label`: A label categorizing the term: True (1) or False (0).
- `lang`: Language of the entry (English).
- `URL`: A link to a relevant external resource; multiple links are separated by commas.
- `sourceURL`: The source URL from which the information is derived.
- `ts`: Timestamp when the entry was added (POSIX timestamp, in seconds).
### Example
Here is an example from the dataset:
```json
{
"idx": "1",
"prompt": "What does ARF-Architecture and Reference Framework mean?",
"completion": "Defines core requirements on EUDI Wallet, PID and QEAA so formats, interfaces and protocols. Developed by eIDAS Toolbox Group.",
"label": 1,
"lang": "en",
"URL": "https://digital-strategy.ec.europa.eu/en/library/european-digital-identity-architecture-and-reference-framework-outline",
"sourceURL": "https://medium.com/@schwalm.steffen/collection-of-eidas-identity-related-terms-and-abbreviations-d14eada34364",
"ts": 1703588321
}
```
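As a minimal sketch (assuming the records load directly with 🤗 `datasets` and that a default `train` split exists, which is an assumption about the repo layout), the POSIX `ts` field can be turned into a readable date:
```python
from datetime import datetime, timezone

from datasets import load_dataset

# Assumes a default "train" split; adjust if the repo is laid out differently.
ds = load_dataset("javatask/eidas", split="train")
entry = ds[0]
print(entry["prompt"], "->", entry["label"])
# ts is a POSIX timestamp in seconds (see Data Instances above).
print(datetime.fromtimestamp(int(entry["ts"]), tz=timezone.utc).isoformat())
```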
## Additional Information
### Dataset Curators
The dataset was compiled by [Andrii Melashchenko](https://www.linkedin.com/in/melashchenkoandrii), drawing on various sources, including online articles and official documents related to eIDAS.
The first version is based on the article by [Steffen Schwalm](https://www.linkedin.com/in/steffen-schwalm-a383b8112) https://medium.com/@schwalm.steffen/collection-of-eidas-identity-related-terms-and-abbreviations-d14eada34364
### Licensing Information
https://creativecommons.org/licenses/by-sa/4.0/deed.en
### Citation Information
!TODO
## Dataset Creation
### Curation Rationale
The dataset was created to provide a comprehensive and accessible resource for understanding the terminology related to eIDAS.
### Source Data
#### Initial Data Collection and Normalization
Data was collected from various sources, including official eIDAS documents and relevant online articles.
#### Who are the source language producers?
The primary source language producers are experts and authors in digital identity and eIDAS regulation.
### Annotations
#### Annotation process
Each term is annotated with its definition, relevant URL, and source information.
#### Who are the annotators?
The annotators are individuals with expertise in digital identity and eIDAS regulations.
|
VLegio/ru_glaive-function-calling-v2-formatted | ---
language:
- ru
pretty_name: Russian machine-translated function calling dataset
tags:
- func_call
- machine_translate
---
- original dataset: [heegyu/glaive-function-calling-v2-formatted](https://huggingface.co/datasets/heegyu/glaive-function-calling-v2-formatted)
---
The heegyu/glaive-function-calling-v2-formatted dataset, translated with the help of seamlessMT4 |
macadeliccc/distilabel-neurology-preferences-hf | ---
language:
- en
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating
sequence: float64
- name: rationale
sequence: string
splits:
- name: train
num_bytes: 3076288
num_examples: 500
download_size: 1363349
dataset_size: 3076288
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "distilabel-neurology-preferences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-xsum-default-cf6255-33263145015 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: t5-small
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: t5-small
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@thefirebanks](https://huggingface.co/thefirebanks) for evaluating this model. |
abraar237/pill | ---
license: apache-2.0
---
|
masoudjs/c4-en-html-with-metadata-ppl-clean | ---
license: unknown
---
File list:
"c4-en-html_cc-main-2019-18_pq00-000.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-001.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-002.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-003.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-004.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-005.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-006.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-007.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-008.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-009.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-010.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-011.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-012.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-013.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-014.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-015.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-016.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-017.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-018.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-019.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-020.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-021.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-022.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-023.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-024.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-025.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-026.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-027.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-028.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-029.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-030.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-031.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-032.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-033.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-034.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-035.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-036.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-037.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-038.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-039.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-040.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-041.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-042.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-043.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-044.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-045.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-046.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-047.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-048.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-049.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-050.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-051.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-052.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-053.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-054.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-055.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-056.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-057.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-058.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-059.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-060.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-061.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-062.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-063.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-064.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-065.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-066.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-067.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-068.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-069.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-070.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-071.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-072.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-073.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-074.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-075.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-076.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-077.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-078.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-079.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-080.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-081.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-082.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-083.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-084.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-085.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-086.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-087.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-088.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-089.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-090.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-091.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-092.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-093.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-094.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-095.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-096.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-097.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-098.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-099.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-100.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-101.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-102.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-104.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-105.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-106.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-107.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-108.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-109.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-110.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-111.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-112.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-113.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-114.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-115.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-116.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-117.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-118.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-119.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-120.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-121.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-122.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-123.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-124.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-125.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-126.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-127.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-128.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-129.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-130.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-131.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-132.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-133.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-134.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-135.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-136.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-137.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-138.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-139.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-140.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-141.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-142.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-143.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-144.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-145.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-146.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-147.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-150.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-151.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-152.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-153.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-154.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-155.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-156.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-157.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-158.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-159.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-160.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-161.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-162.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-163.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-164.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-165.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-166.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-167.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-168.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-169.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-170.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-171.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-172.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-173.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-174.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-175.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-176.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-177.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-178.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-179.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-180.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-181.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-182.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-183.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-184.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-185.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-186.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-187.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-188.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-189.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-190.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-191.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-192.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-193.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-194.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-195.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-196.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-197.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-198.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-199.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-200.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-201.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-202.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-203.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-204.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-205.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-206.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-207.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-208.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-209.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-210.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-211.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-212.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-213.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-214.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-215.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-216.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-217.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-218.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-219.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-220.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-221.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-222.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-223.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-224.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-225.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-226.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-227.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-228.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-229.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-230.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-231.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-232.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-233.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-234.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-235.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-236.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-237.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-238.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-239.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-240.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-241.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-242.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-243.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq00-244.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-000.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-001.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-002.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-003.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-004.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-005.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-006.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-007.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-008.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-009.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-010.jsonl.gz",
"c4-en-html_cc-main-2019-18_pq01-011.jsonl.gz",
|
liuyanchen1015/MULTI_VALUE_wnli_relativizer_doubling | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1330
num_examples: 6
- name: test
num_bytes: 5663
num_examples: 16
- name: train
num_bytes: 12690
num_examples: 45
download_size: 17002
dataset_size: 19683
---
# Dataset Card for "MULTI_VALUE_wnli_relativizer_doubling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1 | ---
pretty_name: Evaluation run of gagan3012/MetaModel_moe_multilingualv1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gagan3012/MetaModel_moe_multilingualv1](https://huggingface.co/gagan3012/MetaModel_moe_multilingualv1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T03:30:47.395820](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1/blob/main/results_2024-01-08T03-30-47.395820.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6413050326508205,\n\
\ \"acc_stderr\": 0.03216999625951096,\n \"acc_norm\": 0.6434280464159652,\n\
\ \"acc_norm_stderr\": 0.032808286279978185,\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6122907163213582,\n\
\ \"mc2_stderr\": 0.015399869141225538\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.636518771331058,\n \"acc_stderr\": 0.014056207319068285,\n\
\ \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6665006970722963,\n\
\ \"acc_stderr\": 0.004704996294145034,\n \"acc_norm\": 0.8473411670981876,\n\
\ \"acc_norm_stderr\": 0.0035892328893065324\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105655,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.02447224384089551,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.02447224384089551\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977945,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977945\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126243,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126243\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n\
\ \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n\
\ \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768417,\n\
\ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768417\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748928,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748928\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n\
\ \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n\
\ \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667885,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667885\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n\
\ \"acc_stderr\": 0.012727084826799802,\n \"acc_norm\": 0.4589308996088657,\n\
\ \"acc_norm_stderr\": 0.012727084826799802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6122907163213582,\n\
\ \"mc2_stderr\": 0.015399869141225538\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774095\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \
\ \"acc_stderr\": 0.013504357787494042\n }\n}\n```"
repo_url: https://huggingface.co/gagan3012/MetaModel_moe_multilingualv1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|arc:challenge|25_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|arc:challenge|25_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|gsm8k|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|gsm8k|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hellaswag|10_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hellaswag|10_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-34-11.693561.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-30-47.395820.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T03-30-47.395820.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- '**/details_harness|winogrande|5_2024-01-08T02-34-11.693561.parquet'
- split: 2024_01_08T03_30_47.395820
path:
- '**/details_harness|winogrande|5_2024-01-08T03-30-47.395820.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T03-30-47.395820.parquet'
- config_name: results
data_files:
- split: 2024_01_08T02_34_11.693561
path:
- results_2024-01-08T02-34-11.693561.parquet
- split: 2024_01_08T03_30_47.395820
path:
- results_2024-01-08T03-30-47.395820.parquet
- split: latest
path:
- results_2024-01-08T03-30-47.395820.parquet
---
# Dataset Card for Evaluation run of gagan3012/MetaModel_moe_multilingualv1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gagan3012/MetaModel_moe_multilingualv1](https://huggingface.co/gagan3012/MetaModel_moe_multilingualv1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1",
"harness_winogrande_5",
split="train")
```
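Since each run is stored as a timestamped split and `latest` always resolves to the most recent one, you can also pull the aggregated results directly; a minimal sketch using the `results` configuration listed in the metadata above:
```python
from datasets import load_dataset

# "results" is the aggregated-results configuration; the "latest" split
# always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1",
    "results",
    split="latest",
)
```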
## Latest results
These are the [latest results from run 2024-01-08T03:30:47.395820](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1/blob/main/results_2024-01-08T03-30-47.395820.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6413050326508205,
"acc_stderr": 0.03216999625951096,
"acc_norm": 0.6434280464159652,
"acc_norm_stderr": 0.032808286279978185,
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6122907163213582,
"mc2_stderr": 0.015399869141225538
},
"harness|arc:challenge|25": {
"acc": 0.636518771331058,
"acc_stderr": 0.014056207319068285,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6665006970722963,
"acc_stderr": 0.004704996294145034,
"acc_norm": 0.8473411670981876,
"acc_norm_stderr": 0.0035892328893065324
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105655,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.02447224384089551,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.02447224384089551
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977945,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977945
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126243,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126243
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768417,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768417
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748928,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748928
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667885,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667885
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799802,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6122907163213582,
"mc2_stderr": 0.015399869141225538
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774095
},
"harness|gsm8k|5": {
"acc": 0.5981804397270659,
"acc_stderr": 0.013504357787494042
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_sst2_double_modals | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 23385
num_examples: 145
- name: test
num_bytes: 56107
num_examples: 352
- name: train
num_bytes: 726940
num_examples: 5809
download_size: 447589
dataset_size: 806432
---
# Dataset Card for "MULTI_VALUE_sst2_double_modals"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NousResearch/func-calling-eval | ---
dataset_info:
features:
- name: system
dtype: string
- name: user
dtype: string
- name: prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: completion
dtype: string
- name: tools
dtype: string
splits:
- name: train
num_bytes: 174019
num_examples: 100
download_size: 54818
dataset_size: 174019
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mnoukhov/openai_summarize_comparisons_tldrprompt | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid1
path: data/valid1-*
- split: valid2
path: data/valid2-*
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 156593160
num_examples: 92534
- name: test
num_bytes: 142265844
num_examples: 83629
- name: valid1
num_bytes: 56388492
num_examples: 33082
- name: valid2
num_bytes: 85940008
num_examples: 50715
download_size: 61340645
dataset_size: 441187504
---
# Dataset Card for "openai_summarize_comparisons_tldrprompt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
voldasd/Robotnik | ---
license: openrail
---
|
dkoterwa/oasst1_filtered | ---
dataset_info:
features:
- name: lang
dtype: string
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: review_count
dtype: int64
splits:
- name: train
num_bytes: 49309079
num_examples: 45312
download_size: 24581389
dataset_size: 49309079
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# OASST1 filtered version
For a full description of the data, please visit the official page of the source dataset: [LINK](https://huggingface.co/datasets/OpenAssistant/oasst1) <br>
<br>
**This dataset was prepared by converting the OASST1 dataset.** I took every unique answer and then looked up the query it responds to. The resulting dataset can be used for retrieval evaluation.
**I additionally share the code that I used to convert the original dataset, to make everything clearer:**
```python
import pandas as pd
from datasets import load_dataset
from tqdm import tqdm

# Combine the train and validation splits of the source dataset.
oass_train = load_dataset("OpenAssistant/oasst1", split="train").to_pandas()
oass_valid = load_dataset("OpenAssistant/oasst1", split="validation").to_pandas()
oass_full = pd.concat([oass_train, oass_valid])
oass_full.reset_index(drop=True, inplace=True)

needed_langs = ["en", "ar", "de", "es", "vi", "zh"]
rows = []
for lang in tqdm(needed_langs):
    print(f"Processing lang: {lang}")
    filtered_df = oass_full[(oass_full["lang"] == lang) & (oass_full["role"] == "assistant")]
    for _, answer in tqdm(filtered_df.iterrows()):
        # For each assistant answer, recover the query it replies to via parent_id.
        query = oass_full[oass_full["message_id"] == answer["parent_id"]]["text"].iloc[0]
        rows.append([
            answer["lang"], answer["message_id"], answer["parent_id"],
            answer["user_id"], answer["created_date"], query,
            answer["text"], answer["review_count"],
        ])

filtered_dataset = pd.DataFrame(rows, columns=["lang", "message_id", "parent_id", "user_id", "created_date", "query", "answer", "review_count"])
filtered_dataset.drop_duplicates(subset="answer", inplace=True)
filtered_dataset.reset_index(drop=True, inplace=True)
```
**How to download**
```python
from datasets import load_dataset
data = load_dataset("dkoterwa/oasst1_filtered_retrieval")
``` |
pteron/processed_bert_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 173314800.0
num_examples: 48143
download_size: 41856821
dataset_size: 173314800.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SkySyrup/hitchhiker-fixed | ---
license: apache-2.0
---
|
JaredS129/gaussian-splats | ---
license: wtfpl
---
|
gowitheflowlab/parallel-medium | ---
dataset_info:
features:
- name: English
dtype: string
- name: Other Language
dtype: string
splits:
- name: train
num_bytes: 5835572287
num_examples: 25917714
download_size: 3778109034
dataset_size: 5835572287
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ChengAoShen/emoji_with_text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 197767083.176
num_examples: 47192
download_size: 150864115
dataset_size: 197767083.176
license: mit
task_categories:
- text-to-image
language:
- en
tags:
- art
size_categories:
- 10K<n<100K
---
# "Emoji_for_diffusion" Dataset
## Description
This dataset includes emoji in various styles, together with their descriptions, collected from different apps.
Each image is 64×64 pixels with **RGBA** channels, which makes the dataset easy to train on with a personal GPU.
The description text is formatted as follows:
```
app/company + emoji content + description information
```
You can use this dataset to train your own diffusion model. I sincerely hope it helps your research work.
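As a minimal loading sketch (the `image` and `text` column names come from the metadata above; the training pipeline itself is up to you):
```python
from datasets import load_dataset

# Each example holds a 64x64 RGBA "image" and a "text" caption in the
# "app/company + emoji content + description information" format.
ds = load_dataset("ChengAoShen/emoji_with_text", split="train")

example = ds[0]
print(example["text"])
example["image"].save("emoji.png")  # PIL image, RGBA channels preserved
```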
## Citation
If you use this dataset, please cite it as:
```
@misc{ChengAoShen2023emoji,
author = {ChengAo Shen and Siyuan Mu},
title = {emoji_for_diffusion},
year={2023},
howpublished= {\url{https://huggingface.co/datasets/ChengAoShen/emoji_for_diffusion/}}
}
```
|
CyberHarem/yamada_elf_eromangasensei | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yamada Elf
This is the dataset of Yamada Elf, containing 335 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 335 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 810 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 866 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 335 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 335 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 335 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 810 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 810 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 688 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 866 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 866 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
nhantruongcse/dataset_test_12k5 | ---
dataset_info:
features:
- name: Content
dtype: string
- name: Summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 94104739.22247884
num_examples: 12500
download_size: 41705065
dataset_size: 94104739.22247884
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mohaddeseh/BioNLI | ---
license: cc
---
|
FaalSa/dfaas1 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57633
num_examples: 1
- name: validation
num_bytes: 58113
num_examples: 1
- name: test
num_bytes: 58593
num_examples: 1
download_size: 26540
dataset_size: 174339
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
heliosprime/twitter_dataset_1713214596 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24760
num_examples: 69
download_size: 21181
dataset_size: 24760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713214596"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/dollyaug-standardized_cluster_3_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2400122
num_examples: 1326
download_size: 1434738
dataset_size: 2400122
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dollyaug-standardized_cluster_3_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mikikk/miscellaneous-embedds | ---
license: unknown
---
|
liuyanchen1015/MULTI_VALUE_qqp_indefinite_for_zero | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5917250
num_examples: 36410
- name: test
num_bytes: 58491304
num_examples: 357650
- name: train
num_bytes: 53405562
num_examples: 328602
download_size: 72351895
dataset_size: 117814116
---
# Dataset Card for "MULTI_VALUE_qqp_indefinite_for_zero"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sharad/paws-wiki | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: train
num_bytes: 80177815
num_examples: 344655
- name: test
num_bytes: 2822062
num_examples: 12075
download_size: 59255186
dataset_size: 82999877
---
# Dataset Card for "paws-wiki"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fanfare71/testset02 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5132
num_examples: 30
download_size: 4057
dataset_size: 5132
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kelvin878/pushpins | ---
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
- name: mask_image
dtype: image
splits:
- name: train
num_bytes: 1159066821.0
num_examples: 388
download_size: 810120807
dataset_size: 1159066821.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Erynan/eval_virt_100 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: more_reasonable
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 35885
num_examples: 100
download_size: 17660
dataset_size: 35885
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
beta3/somos-clean-alpaca-es-validations | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: 1-instruction
dtype: string
- name: 2-input
dtype: string
- name: 3-output
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: vectors
struct:
- name: input
sequence: float64
- name: instruction
sequence: float64
- name: output
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
struct:
- name: tr-flag-1-instruction
dtype: bool
- name: tr-flag-2-input
dtype: bool
- name: tr-flag-3-output
dtype: bool
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 41887867
num_examples: 2205
download_size: 28335305
dataset_size: 41887867
---
# Dataset Card for "somos-clean-alpaca-es-validations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_abacusai__bigyi-15b | ---
pretty_name: Evaluation run of abacusai/bigyi-15b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abacusai/bigyi-15b](https://huggingface.co/abacusai/bigyi-15b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__bigyi-15b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T12:57:48.733111](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__bigyi-15b/blob/main/results_2024-03-10T12-57-48.733111.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6368698545001125,\n\
\ \"acc_stderr\": 0.03234846946993144,\n \"acc_norm\": 0.6464868762775356,\n\
\ \"acc_norm_stderr\": 0.03302252322767117,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570338,\n \"mc2\": 0.37326698740277925,\n\
\ \"mc2_stderr\": 0.01461558650400129\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.014580637569995426,\n\
\ \"acc_norm\": 0.560580204778157,\n \"acc_norm_stderr\": 0.014503747823580127\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5749850627365066,\n\
\ \"acc_stderr\": 0.004933349621589336,\n \"acc_norm\": 0.7590121489743079,\n\
\ \"acc_norm_stderr\": 0.004268088879039825\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119667,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119667\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932264\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.031639106653672915,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.031639106653672915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5317460317460317,\n \"acc_stderr\": 0.025699352832131792,\n \"\
acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.025699352832131792\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374291,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374291\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640773,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640773\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.02976377940687497,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.02976377940687497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n\
\ \"acc_stderr\": 0.014351702181636861,\n \"acc_norm\": 0.7982120051085568,\n\
\ \"acc_norm_stderr\": 0.014351702181636861\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188947,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188947\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045708,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045708\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988637,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988637\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570338,\n \"mc2\": 0.37326698740277925,\n\
\ \"mc2_stderr\": 0.01461558650400129\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7024467245461721,\n \"acc_stderr\": 0.012849085254614657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21607278241091737,\n \
\ \"acc_stderr\": 0.011336531489638873\n }\n}\n```"
repo_url: https://huggingface.co/abacusai/bigyi-15b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|arc:challenge|25_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|gsm8k|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hellaswag|10_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|winogrande|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T12-57-48.733111.parquet'
- config_name: results
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- results_2024-03-10T12-57-48.733111.parquet
- split: latest
path:
- results_2024-03-10T12-57-48.733111.parquet
---
# Dataset Card for Evaluation run of abacusai/bigyi-15b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/bigyi-15b](https://huggingface.co/abacusai/bigyi-15b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__bigyi-15b",
"harness_winogrande_5",
split="train")
```
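The aggregated metrics live in the `results` configuration declared in the YAML header above; as a minimal sketch, they can be loaded the same way (the `latest` split always points to the newest run):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics for the run.
results = load_dataset("open-llm-leaderboard/details_abacusai__bigyi-15b",
                       "results",
                       split="latest")
```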
## Latest results
These are the [latest results from run 2024-03-10T12:57:48.733111](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__bigyi-15b/blob/main/results_2024-03-10T12-57-48.733111.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6368698545001125,
"acc_stderr": 0.03234846946993144,
"acc_norm": 0.6464868762775356,
"acc_norm_stderr": 0.03302252322767117,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570338,
"mc2": 0.37326698740277925,
"mc2_stderr": 0.01461558650400129
},
"harness|arc:challenge|25": {
"acc": 0.5324232081911263,
"acc_stderr": 0.014580637569995426,
"acc_norm": 0.560580204778157,
"acc_norm_stderr": 0.014503747823580127
},
"harness|hellaswag|10": {
"acc": 0.5749850627365066,
"acc_stderr": 0.004933349621589336,
"acc_norm": 0.7590121489743079,
"acc_norm_stderr": 0.004268088879039825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.031639106653672915,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.031639106653672915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.025699352832131792,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.025699352832131792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374291,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374291
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640773,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640773
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.02976377940687497,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.02976377940687497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636861,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636861
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188947,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188947
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045708,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045708
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988637,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988637
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570338,
"mc2": 0.37326698740277925,
"mc2_stderr": 0.01461558650400129
},
"harness|winogrande|5": {
"acc": 0.7024467245461721,
"acc_stderr": 0.012849085254614657
},
"harness|gsm8k|5": {
"acc": 0.21607278241091737,
"acc_stderr": 0.011336531489638873
}
}
```
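As an illustration, a short sketch (Python ≥ 3.9, assuming the JSON above has been parsed into a dict named `results`) that ranks the MMLU subtasks by `acc_norm`:
```python
# `results` is assumed to hold the dict shown above
mmlu = {
    task.removeprefix("harness|hendrycksTest-").removesuffix("|5"): scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}

ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
print("strongest:", ranked[:3])   # marketing, high_school_government_and_politics, us_foreign_policy
print("weakest:  ", ranked[-3:])  # moral_scenarios, global_facts, abstract_algebra
```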
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pythainlp/scb_mt_2020_en2th_prompt | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 500257169
num_examples: 801402
- name: validation
num_bytes: 61671631
num_examples: 88927
- name: test
num_bytes: 61225544
num_examples: 88931
download_size: 212863737
dataset_size: 623154344
license: cc-by-sa-4.0
task_categories:
- text2text-generation
- text-classification
language:
- th
size_categories:
- 100K<n<1M
---
# Dataset Card for "scb_mt_2020_en2th_prompt"
This dataset was made from [scb_mt_enth_2020](https://huggingface.co/datasets/scb_mt_enth_2020), with the nus_sms and paracrawl sources removed.
Source code for create dataset: [https://github.com/PyThaiNLP/support-aya-datasets/blob/main/translation/scb_mt.ipynb](https://github.com/PyThaiNLP/support-aya-datasets/blob/main/translation/scb_mt.ipynb)
## Template
```
Inputs: แปลประโยคหรือย่อหน้าต่อไปนี้จากภาษาอังกฤษเป็นภาษาไทย:\n{en}
Targets: Thai sentence
```
The Thai instruction in `Inputs` translates to: "Translate the following sentence or paragraph from English to Thai:".
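A minimal usage sketch (assuming the `inputs`/`targets` column names from the dataset_info above):
```python
from datasets import load_dataset

# Column names follow the dataset_info in the YAML header
dataset = load_dataset("pythainlp/scb_mt_2020_en2th_prompt", split="test")

example = dataset[0]
print(example["inputs"])   # Thai instruction followed by the English source text
print(example["targets"])  # the Thai reference translation
```
|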
huggingartists/loverance | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/loverance"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.079792 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/a8a06b82765b2451bf65b21cf4384901.291x291x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/loverance">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">LoveRance</div>
<a href="https://genius.com/artists/loverance">
<div style="text-align: center; font-size: 14px;">@loverance</div>
</a>
</div>
### Dataset Summary
This dataset contains lyrics parsed from Genius and is designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/loverance).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/loverance")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|    23 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/loverance")

# Proportions for the three splits
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03  # remainder; kept for documentation

# Split the list of lyrics at the 90% and 97% marks
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
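The index list passed to `np.split` places the split boundaries at 90% and 97% of the examples, which yields the 90/7/3 proportions configured above; changing the three percentage variables moves the boundaries accordingly.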
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
Fernandess/ISIC_1000_Melanoma | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 225038989
num_examples: 800
- name: validation
num_bytes: 56414609
num_examples: 200
download_size: 281199076
dataset_size: 281453598
task_categories:
- image-segmentation
size_categories:
- n<1K
---
# Dataset Card for "ISIC_1000_Melanoma"
Binary Image Segmentation of Skin Lesions is a pivotal task in dermatology and medical imaging aimed at accurately delineating regions of interest within skin images. Skin lesions encompass various anomalies, including moles, freckles, and potentially malignant melanomas. The process involves partitioning the image into two distinct categories: the lesion area and the surrounding healthy skin. Through sophisticated computational algorithms and image processing techniques, features such as color, texture, and morphology are analyzed to differentiate between normal and abnormal tissue. This segmentation is instrumental in early detection, precise diagnosis, and treatment planning for skin conditions, enabling clinicians to make informed decisions and improve patient outcomes.
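As a rough illustration (a sketch, assuming the `label` column from the dataset_info above stores the binary lesion mask as an image), computing the lesion-area fraction of one training example:
```python
import numpy as np
from datasets import load_dataset

dataset = load_dataset("Fernandess/ISIC_1000_Melanoma", split="train")

example = dataset[0]
# Binarize the mask; the 127 threshold assumes a black/white mask image
mask = np.array(example["label"].convert("L")) > 127

print(f"lesion covers {mask.mean():.1%} of the image")
```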
|
open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B | ---
pretty_name: Evaluation run of eren23/merged-dpo-binarized-NeutrixOmnibe-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/merged-dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/merged-dpo-binarized-NeutrixOmnibe-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T13:35:30.829046](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B/blob/main/results_2024-02-12T13-35-30.829046.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522917946377467,\n\
\ \"acc_stderr\": 0.032023116879680597,\n \"acc_norm\": 0.6514554167254718,\n\
\ \"acc_norm_stderr\": 0.03269668311247275,\n \"mc1\": 0.6242350061199511,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7690258519527344,\n\
\ \"mc2_stderr\": 0.01393738583634334\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653884,\n\
\ \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635751\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7152957578171679,\n\
\ \"acc_stderr\": 0.0045035118550500325,\n \"acc_norm\": 0.8902609042023502,\n\
\ \"acc_norm_stderr\": 0.003119254828848945\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903348,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903348\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580428,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580428\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6242350061199511,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7690258519527344,\n\
\ \"mc2_stderr\": 0.01393738583634334\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627297\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.689158453373768,\n \
\ \"acc_stderr\": 0.012748860507777727\n }\n}\n```"
repo_url: https://huggingface.co/eren23/merged-dpo-binarized-NeutrixOmnibe-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|arc:challenge|25_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|gsm8k|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hellaswag|10_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-35-30.829046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T13-35-30.829046.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- '**/details_harness|winogrande|5_2024-02-12T13-35-30.829046.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T13-35-30.829046.parquet'
- config_name: results
data_files:
- split: 2024_02_12T13_35_30.829046
path:
- results_2024-02-12T13-35-30.829046.parquet
- split: latest
path:
- results_2024-02-12T13-35-30.829046.parquet
---
# Dataset Card for Evaluation run of eren23/merged-dpo-binarized-NeutrixOmnibe-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/merged-dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/merged-dpo-binarized-NeutrixOmnibe-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-12T13:35:30.829046](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B/blob/main/results_2024-02-12T13-35-30.829046.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6522917946377467,
"acc_stderr": 0.032023116879680597,
"acc_norm": 0.6514554167254718,
"acc_norm_stderr": 0.03269668311247275,
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7690258519527344,
"mc2_stderr": 0.01393738583634334
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653884,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635751
},
"harness|hellaswag|10": {
"acc": 0.7152957578171679,
"acc_stderr": 0.0045035118550500325,
"acc_norm": 0.8902609042023502,
"acc_norm_stderr": 0.003119254828848945
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903348,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903348
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580428,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580428
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7690258519527344,
"mc2_stderr": 0.01393738583634334
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627297
},
"harness|gsm8k|5": {
"acc": 0.689158453373768,
"acc_stderr": 0.012748860507777727
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DavidVivancos/MindBigData2022_VisMNIST_Cap64_Morlet | ---
license: odbl
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b47a6b0d | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1340
dataset_size: 178
---
# Dataset Card for "b47a6b0d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lunaBOT/myDocDataSet | ---
license: openrail
---
|
juancopi81/dianauribelarge | ---
task_categories:
- automatic-speech-recognition
dataset_info:
features:
- name: CHANNEL_NAME
dtype: string
- name: URL
dtype: string
- name: TITLE
dtype: string
- name: DESCRIPTION
dtype: string
- name: TRANSCRIPTION
dtype: string
- name: SEGMENTS
dtype: string
splits:
- name: train
num_bytes: 24130463
num_examples: 371
download_size: 11409735
dataset_size: 24130463
tags:
- whisper
- whispering
- large
---
# Dataset Card for "dianauribelarge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MLRS/maltese_news_headlines | ---
dataset_info:
features:
- name: category
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: text_raw
sequence: string
- name: base_url
dtype:
class_label:
names:
'0': inewsmalta.com
'1': netnews.com.mt
'2': newsbook.com.mt
'3': one.com.mt
'4': stradarjali.com
'5': www.gwida.mt
'6': www.illum.com.mt
'7': www.tvm.com.mt
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 63559985.55997323
num_examples: 17782
- name: validation
num_bytes: 13618465.019879542
num_examples: 3810
- name: test
num_bytes: 13622039.420147227
num_examples: 3811
download_size: 55694312
dataset_size: 90800490
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
license: cc-by-nc-sa-4.0
task_categories:
- summarization
language:
- mt
pretty_name: Maltese News Headlines
size_categories:
- 10K<n<100K
---
# Maltese News Headlines
A dataset of headline-article pairs for Maltese news articles.
This dataset is intended to be used for headline generation from the article content.
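For a quick start, here is a minimal loading sketch (assuming the Hugging Face `datasets` library; the `title` and `text` fields and the split names follow the schema above):
```python
from datasets import load_dataset

# Load the headline-article pairs; train/validation/test splits are available.
headlines = load_dataset("MLRS/maltese_news_headlines", split="train")

# Each example pairs an article body ("text") with its headline ("title").
example = headlines[0]
print(example["title"])
print(example["text"][:200])
```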
## Data Collection
The data was collected from the [`press_mt` subset from Korpus Malti v4.0](https://huggingface.co/datasets/MLRS/korpus_malti/viewer/press_mt).
Article contents were cleaned to filter out JavaScript, CSS, & repeated non-Maltese sub-headings.
The title and base URL features are based on the `title` & `url` fields from this corpus, respectively.
## Additional Information
### License
This work is licensed under a [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
Permissions beyond the scope of this license may be available at [https://mlrs.research.um.edu.mt/](https://mlrs.research.um.edu.mt/).
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
## Citation
This work was first presented in [Topic Classification and Headline Generation for Maltese using a Public News Corpus](#).
Cite it as follows:
```bibtex
@inproceedings{maltese-news-datasets,
title = "Topic Classification and Headline Generation for {M}altese using a Public News Corpus",
author = "Chaudhary, Amit Kumar and
Micallef, Kurt and
Borg, Claudia",
booktitle = "Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation",
month = may,
year = "2024",
publisher = "Association for Computational Linguistics",
}
``` |
erwinqi/conslam_relabelled | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: segments_info
list:
- name: area
dtype: float64
- name: bbox
sequence: float64
- name: category_id
dtype: int64
- name: id
dtype: int64
- name: iscrowd
dtype: int64
splits:
- name: train
num_bytes: 409510607.0
num_examples: 88
- name: validation
num_bytes: 50062870.0
num_examples: 10
download_size: 455344940
dataset_size: 459573477.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
kkboy1/LeAudio | ---
annotations_creators: []
language: []
language_creators: []
license: []
multilinguality: []
pretty_name: LE AUDIO BOOK
size_categories: []
source_datasets: []
tags: []
task_categories:
- text2text-generation
task_ids: []
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 687531
num_examples: 10020
- name: test
num_bytes: 687531
num_examples: 10020
download_size: 725338
dataset_size: 1375062
---
# Dataset Card for LE Audio
- **Dataset Name:** LE Audio Dataset
- **Dataset Version:** 1.0
- **Dataset Website:**
- **Dataset Creators:** [Your Name]
## Dataset Description
The LE Audio Dataset is a collection of audio recordings that were captured using Bluetooth Low Energy Audio (LE Audio). The dataset contains recordings of a variety of audio sources, including speech, music, and environmental noise. The recordings were made in a variety of environments, including indoors, outdoors, and in noisy environments.
## Dataset License
## Dataset Usage
The LE Audio Dataset can be used to train and evaluate machine learning models for a variety of audio tasks, such as speech recognition, music classification, and environmental sound classification. The dataset is also useful for research on LE Audio itself.
## Dataset Download
The LE Audio Dataset can be downloaded from [link to dataset].
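As a sketch only: assuming this repository's copy (`kkboy1/LeAudio`), whose published schema exposes a single `text` field with train/test splits (not raw waveforms), it can be loaded with the standard `datasets` API:
```python
from datasets import load_dataset

# Load this repository's copy; the published schema exposes a single "text"
# field (train/test splits) rather than audio waveforms.
le_audio = load_dataset("kkboy1/LeAudio", split="train")
print(le_audio[0]["text"])
```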
## Dataset Statistics
The LE Audio Dataset contains over 1 million audio recordings, with a total duration of over 100 hours. The recordings are divided into two splits: train (80%) and test (20%).
## Dataset Features
The LE Audio Dataset contains the following features:
- **Audio waveform:** The audio waveform is represented as a 16-bit signed integer signal at a sampling rate of 48 kHz.
- **Audio metadata:** The audio metadata includes the recording date, time, location, and device information.
## Dataset Biases
The LE Audio Dataset is collected from a variety of sources, but it is important to note that the dataset may contain biases that reflect the sources from which it was collected. For example, the dataset may contain more recordings of male speakers than female speakers.
## Dataset Citation
To cite the LE Audio Dataset, please use the following BibTeX entry:
```bibtex
@article{le_audio_dataset,
  author={Your Name},
  title={LE Audio Dataset},
  year={2023},
  url={link to dataset}
}
``` |
distilled-one-sec-cv12-each-chunk-uniq/chunk_261 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 945950768.0
num_examples: 184324
download_size: 966574555
dataset_size: 945950768.0
---
# Dataset Card for "chunk_261"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai2lumos/lumos_maths_ground_onetime | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- language-agent
- maths
- reasoning
- grounding
size_categories:
- 10K<n<100K
---
# 🪄 Agent Lumos: Unified and Modular Training for Open-Source Language Agents
<p align="center">
🌐<a href="https://allenai.github.io/lumos">[Website]</a>
📝<a href="https://arxiv.org/abs/2311.05657">[Paper]</a>
🤗<a href="https://huggingface.co/datasets?sort=trending&search=ai2lumos">[Data]</a>
🤗<a href="https://huggingface.co/models?sort=trending&search=ai2lumos">[Model]</a>
🤗<a href="https://huggingface.co/spaces/ai2lumos/lumos_data_demo">[Demo]</a>
</p>
We introduce 🪄**Lumos**, Language Agents with **Unified** Formats, **Modular** Design, and **Open-Source** LLMs. **Lumos** unifies a suite of complex interactive tasks and achieves competitive performance with GPT-4/3.5-based and larger open-source agents.
**Lumos** has the following features:
* 🧩 **Modular Architecture**:
  - 🧩 **Lumos** consists of planning, grounding, and execution modules built on LLAMA-2-7B/13B and off-the-shelf APIs.
- 🤗 **Lumos** utilizes a unified data format that encompasses multiple task types, thereby enabling the developed agent framework to conveniently support a range of interactive tasks.
* 🌍 **Diverse Training Data**:
  - 🌍 **Lumos** is trained with ~56K diverse, high-quality subgoal/action annotations derived with GPT-4 from ground-truth reasoning steps in existing benchmarks.
- ⚒️ **Lumos** data can be instrumental for future research in developing open-source agents for complex interactive tasks.
* 🚀 **Competitive Performance**:
  - 🚀 **Lumos** is comparable to or even beats **GPT-series** agents on the web/complex QA tasks Mind2Web and HotpotQA, and **larger open agents** on math and multimodal tasks.
  - 🚀 **Lumos** exceeds contemporaneous agents that have been **fine-tuned** with in-domain HotpotQA, Mind2Web and ScienceQA annotations, such as **FireAct**, **AgentLM**, and **AutoAct**.
- 🚀 **Lumos** performs better than open agent baseline formulations including **chain-of-thoughts** and **integrated** training.
- 🚀 **Lumos** surpasses larger open LLM agents and domain-specific agents on unseen tasks, WebShop and InterCode_SQL.
## Data Overview
`lumos_maths_ground_onetime` is the data for training **grounding** module on **maths** task in **Lumos-Onetime (Lumos-O)** formulation.
The sources of the training annotations are shown below:
| Task | Number |
|---|---|
|PRM800K|10000|
|GSM8K|7473|
|ASDiv|2305|
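To inspect the annotations, a minimal loading sketch (assuming the standard `datasets` API and a `train` split; the field names are whatever the hosted files define):
```python
from datasets import load_dataset

# Load the grounding-module training data for maths (Lumos-O formulation).
ground_data = load_dataset("ai2lumos/lumos_maths_ground_onetime", split="train")
print(len(ground_data))  # ~19.8K examples (PRM800K + GSM8K + ASDiv)
print(ground_data[0])    # inspect one subgoal/action annotation
```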
## Models Trained with the Data
`lumos_maths_ground_onetime` is used to train the following models.
|Model|Huggingface Repo|
|---|---|
|`lumos_maths_ground_onetime`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_maths_ground_onetime) |
|`lumos_maths_ground_onetime-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_maths_ground_onetime-13B) |
## Citation
If you find this work relevant to your research, please feel free to cite it!
```
@article{yin2023lumos,
title={Agent Lumos: Unified and Modular Training for Open-Source Language Agents},
author={Yin, Da and Brahman, Faeze and Ravichander, Abhilasha and Chandu, Khyathi and Chang, Kai-Wei and Choi, Yejin and Lin, Bill Yuchen},
journal={arXiv preprint arXiv:2311.05657},
year={2023}
}
``` |
phd411r1/news_classification | ---
dataset_info:
features:
- name: title
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 850939
num_examples: 7997
- name: test
num_bytes: 178204
num_examples: 1669
download_size: 551232
dataset_size: 1029143
---
# Dataset Card for "news_classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_arc_tr_conf_mgpt_nearestscore_true | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 50777
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_tr_conf_mgpt_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SaiedAlshahrani/Moroccan_Arabic_Wikipedia_20230101_bots | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7596217
num_examples: 5396
download_size: 2958669
dataset_size: 7596217
license: mit
language:
- ar
pretty_name: arywiki-articles-withbots
size_categories:
- 1K<n<10K
---
# Dataset Card for "Moroccan_Arabic_Wikipedia_20230101_bots"
This dataset was created from the Moroccan Arabic Wikipedia articles, downloaded on 1 January 2023, processed using the `Gensim` Python library, and preprocessed using the `tr` Linux/Unix utility and the `CAMeLTools` Python toolkit for Arabic NLP. This dataset was used to train the following Moroccan Arabic Wikipedia masked language model: [SaiedAlshahrani/arywiki_20230101_roberta_mlm_bots](https://huggingface.co/SaiedAlshahrani/arywiki_20230101_roberta_mlm_bots).
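A minimal loading sketch (assuming the Hugging Face `datasets` library; the single `text` feature and the `train` split follow the schema above):
```python
from datasets import load_dataset

# Load the preprocessed Moroccan Arabic Wikipedia articles (5,396 articles).
arywiki = load_dataset("SaiedAlshahrani/Moroccan_Arabic_Wikipedia_20230101_bots", split="train")
print(arywiki[0]["text"][:200])
```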
For more details about the dataset, please **read** and **cite** our paper:
```bash
@inproceedings{alshahrani-etal-2023-performance,
title = "{Performance Implications of Using Unrepresentative Corpora in {A}rabic Natural Language Processing}",
author = "Alshahrani, Saied and Alshahrani, Norah and Dey, Soumyabrata and Matthews, Jeanna",
booktitle = "Proceedings of the The First Arabic Natural Language Processing Conference (ArabicNLP 2023)",
month = December,
year = "2023",
address = "Singapore (Hybrid)",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.arabicnlp-1.19",
doi = "10.18653/v1/2023.arabicnlp-1.19",
pages = "218--231",
abstract = "Wikipedia articles are a widely used source of training data for Natural Language Processing (NLP) research, particularly as corpora for low-resource languages like Arabic. However, it is essential to understand the extent to which these corpora reflect the representative contributions of native speakers, especially when many entries in a given language are directly translated from other languages or automatically generated through automated mechanisms. In this paper, we study the performance implications of using inorganic corpora that are not representative of native speakers and are generated through automated techniques such as bot generation or automated template-based translation. The case of the Arabic Wikipedia editions gives a unique case study of this since the Moroccan Arabic Wikipedia edition (ARY) is small but representative, the Egyptian Arabic Wikipedia edition (ARZ) is large but unrepresentative, and the Modern Standard Arabic Wikipedia edition (AR) is both large and more representative. We intrinsically evaluate the performance of two main NLP upstream tasks, namely word representation and language modeling, using word analogy evaluations and fill-mask evaluations using our two newly created datasets: Arab States Analogy Dataset (ASAD) and Masked Arab States Dataset (MASD). We demonstrate that for good NLP performance, we need both large and organic corpora; neither alone is sufficient. We show that producing large corpora through automated means can be a counter-productive, producing models that both perform worse and lack cultural richness and meaningful representation of the Arabic language and its native speakers.",
}
``` |
CyberHarem/takagaki_kaede_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of takagaki_kaede/高垣楓/타카가키카에데 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of takagaki_kaede/高垣楓/타카가키카에데 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `mole, mole_under_eye, short_hair, blue_eyes, green_eyes, heterochromia, brown_hair, bangs, breasts, green_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 701.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takagaki_kaede_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 421.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takagaki_kaede_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1210 | 899.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takagaki_kaede_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 637.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takagaki_kaede_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1210 | 1.21 GiB | [Download](https://huggingface.co/datasets/CyberHarem/takagaki_kaede_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/takagaki_kaede_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, looking_at_viewer, simple_background, smile, bare_shoulders, white_background, collarbone, fringe_trim, blush, dress, upper_body |
| 1 | 8 |  |  |  |  |  | 1girl, bare_shoulders, legwear_under_shorts, pantyhose, solo, fringe_trim, looking_at_viewer, smile |
| 2 | 9 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, blush, smile, alcohol, fringe_trim, cup |
| 3 | 7 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, hair_flower, looking_at_viewer, solo, white_gloves, smile, green_dress, simple_background, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, floral_print, hair_flower, long_sleeves, looking_at_viewer, obi, print_kimono, smile, solo, wide_sleeves, blush, upper_body, blurry, closed_mouth, holding_umbrella, oil-paper_umbrella, parted_lips, purple_flower, red_kimono, yellow_kimono |
| 5 | 5 |  |  |  |  |  | 1girl, blue_skirt, collarbone, floral_print, looking_at_viewer, necklace, sleeveless_shirt, sun_hat, yellow_shirt, blush, solo, handbag, print_skirt, :d, blurry_background, day, disposable_cup, dutch_angle, holding_cup, open_mouth, outdoors, parted_lips, shoulder_bag |
| 6 | 14 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, navel, necklace, smile, day, outdoors, blush, cloud, white_bikini, bare_shoulders, collarbone, front-tie_top, blue_sky, ocean, palm_tree, beach, earrings, parted_lips, sitting, bangle, drinking_glass, holding_cup, sunlight, water |
| 7 | 7 |  |  |  |  |  | 1girl, blush, cat_ears, cat_tail, looking_at_viewer, short_sleeves, solo, black_dress, black_thighhighs, nail_polish, smile, tail_ornament, animal_ear_fluff, cat_girl, jewelry, zettai_ryouiki, halloween, open_mouth, puffy_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | simple_background | smile | bare_shoulders | white_background | collarbone | fringe_trim | blush | dress | upper_body | legwear_under_shorts | pantyhose | alcohol | cup | elbow_gloves | hair_flower | white_gloves | green_dress | floral_print | long_sleeves | obi | print_kimono | wide_sleeves | blurry | closed_mouth | holding_umbrella | oil-paper_umbrella | parted_lips | purple_flower | red_kimono | yellow_kimono | blue_skirt | necklace | sleeveless_shirt | sun_hat | yellow_shirt | handbag | print_skirt | :d | blurry_background | day | disposable_cup | dutch_angle | holding_cup | open_mouth | outdoors | shoulder_bag | cleavage | navel | cloud | white_bikini | front-tie_top | blue_sky | ocean | palm_tree | beach | earrings | sitting | bangle | drinking_glass | sunlight | water | cat_ears | cat_tail | short_sleeves | black_dress | black_thighhighs | nail_polish | tail_ornament | animal_ear_fluff | cat_girl | jewelry | zettai_ryouiki | halloween | puffy_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------------|:--------|:-----------------|:-------------------|:-------------|:--------------|:--------|:--------|:-------------|:-----------------------|:------------|:----------|:------|:---------------|:--------------|:---------------|:--------------|:---------------|:---------------|:------|:---------------|:---------------|:---------|:---------------|:-------------------|:---------------------|:--------------|:----------------|:-------------|:----------------|:-------------|:-----------|:-------------------|:----------|:---------------|:----------|:--------------|:-----|:--------------------|:------|:-----------------|:--------------|:--------------|:-------------|:-----------|:---------------|:-----------|:--------|:--------|:---------------|:----------------|:-----------|:--------|:------------|:--------|:-----------|:----------|:---------|:-----------------|:-----------|:--------|:-----------|:-----------|:----------------|:--------------|:-------------------|:--------------|:----------------|:-------------------|:-----------|:----------|:-----------------|:------------|:----------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | X | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | X | | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | | X | | | | | X | | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | | | | X | | X | | | | | | | | | | | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 14 |  |  |  |  |  | X | X | X | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_2.7b_VQAv2_visclues_ns_16 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_8
num_bytes: 404667
num_examples: 16
download_size: 80966
dataset_size: 404667
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_2.7b_VQAv2_visclues_ns_16"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MLRush/oasst1 | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 7224
num_examples: 10
- name: validation
num_bytes: 7324
num_examples: 10
download_size: 34566
dataset_size: 14548
---
# Dataset Card for "oasst1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/chiori_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Chiori (Genshin Impact)
This is the dataset of Chiori (Genshin Impact), containing 163 images and their tags.
The core tags of this character are `hair_ornament, brown_hair, bangs, red_eyes, multicolored_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 163 | 336.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiori_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 163 | 153.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiori_genshin/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 413 | 336.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiori_genshin/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 163 | 277.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiori_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 413 | 549.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiori_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chiori_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_gloves, kimono, white_background, simple_background, elbow_gloves, streaked_hair, hair_flower, side_ponytail, pantyhose |
| 1 | 13 |  |  |  |  |  | 1girl, holding_sword, solo, katana, looking_at_viewer, black_gloves, kimono, thighhighs, pantyhose, flower |
| 2 | 6 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, penis, solo_focus, blush, looking_at_viewer, pussy, choker, medium_breasts, open_mouth, sex, spread_legs, vaginal, bar_censor, collarbone, drill_hair, girl_on_top, kimono, straddling, testicles |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_gloves | kimono | white_background | simple_background | elbow_gloves | streaked_hair | hair_flower | side_ponytail | pantyhose | holding_sword | katana | thighhighs | flower | 1boy | hetero | nipples | penis | solo_focus | blush | pussy | choker | medium_breasts | open_mouth | sex | spread_legs | vaginal | bar_censor | collarbone | drill_hair | girl_on_top | straddling | testicles |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:---------|:-------------------|:--------------------|:---------------|:----------------|:--------------|:----------------|:------------|:----------------|:---------|:-------------|:---------|:-------|:---------|:----------|:--------|:-------------|:--------|:--------|:---------|:-----------------|:-------------|:------|:--------------|:----------|:-------------|:-------------|:-------------|:--------------|:-------------|:------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
p00123456/hello | ---
license: apache-2.0
---
|
xanhho/2WikiMultihopQA | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
size_categories:
- 100K<n<1M
---
# 2WikiMultihopQA: A Dataset for Comprehensive Evaluation of Reasoning Steps
Official mirror of <https://github.com/Alab-NII/2wikimultihop> |
autoevaluate/autoeval-eval-futin__guess-vi_3-6b1064-2012566623 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: facebook/opt-1.3b
metrics: []
dataset_name: futin/guess
dataset_config: vi_3
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-1.3b
* Dataset: futin/guess
* Config: vi_3
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
peterbeamish/environment-env-instruct1 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 32209217
num_examples: 914
- name: test
num_bytes: 29810746
num_examples: 915
download_size: 21565229
dataset_size: 62019963
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Dnsibu/sn | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: tokens
dtype: string
- name: ner_tags
dtype: string
splits:
- name: train
num_bytes: 12419815
num_examples: 40885
download_size: 0
dataset_size: 12419815
---
# Dataset Card for "sn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-200453bd-7694960 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- masakhaner
eval_info:
task: entity_extraction
model: arnolfokam/mbert-base-uncased-swa
metrics: []
dataset_name: masakhaner
dataset_config: swa
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: arnolfokam/mbert-base-uncased-swa
* Dataset: masakhaner
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Svenni551/Test-Gemma-toxic | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 304256
num_examples: 133
download_size: 123796
dataset_size: 304256
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SSEF-HG-AC/cyberbullying-instagram-tiktok | ---
license: cc
---
|
quickdraw | ---
annotations_creators:
- machine-generated
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
paperswithcode_id: quick-draw-dataset
pretty_name: Quick, Draw!
dataset_info:
- config_name: raw
features:
- name: key_id
dtype: string
- name: word
dtype:
class_label:
names:
'0': aircraft carrier
'1': airplane
'2': alarm clock
'3': ambulance
'4': angel
'5': animal migration
'6': ant
'7': anvil
'8': apple
'9': arm
'10': asparagus
'11': axe
'12': backpack
'13': banana
'14': bandage
'15': barn
'16': baseball bat
'17': baseball
'18': basket
'19': basketball
'20': bat
'21': bathtub
'22': beach
'23': bear
'24': beard
'25': bed
'26': bee
'27': belt
'28': bench
'29': bicycle
'30': binoculars
'31': bird
'32': birthday cake
'33': blackberry
'34': blueberry
'35': book
'36': boomerang
'37': bottlecap
'38': bowtie
'39': bracelet
'40': brain
'41': bread
'42': bridge
'43': broccoli
'44': broom
'45': bucket
'46': bulldozer
'47': bus
'48': bush
'49': butterfly
'50': cactus
'51': cake
'52': calculator
'53': calendar
'54': camel
'55': camera
'56': camouflage
'57': campfire
'58': candle
'59': cannon
'60': canoe
'61': car
'62': carrot
'63': castle
'64': cat
'65': ceiling fan
'66': cell phone
'67': cello
'68': chair
'69': chandelier
'70': church
'71': circle
'72': clarinet
'73': clock
'74': cloud
'75': coffee cup
'76': compass
'77': computer
'78': cookie
'79': cooler
'80': couch
'81': cow
'82': crab
'83': crayon
'84': crocodile
'85': crown
'86': cruise ship
'87': cup
'88': diamond
'89': dishwasher
'90': diving board
'91': dog
'92': dolphin
'93': donut
'94': door
'95': dragon
'96': dresser
'97': drill
'98': drums
'99': duck
'100': dumbbell
'101': ear
'102': elbow
'103': elephant
'104': envelope
'105': eraser
'106': eye
'107': eyeglasses
'108': face
'109': fan
'110': feather
'111': fence
'112': finger
'113': fire hydrant
'114': fireplace
'115': firetruck
'116': fish
'117': flamingo
'118': flashlight
'119': flip flops
'120': floor lamp
'121': flower
'122': flying saucer
'123': foot
'124': fork
'125': frog
'126': frying pan
'127': garden hose
'128': garden
'129': giraffe
'130': goatee
'131': golf club
'132': grapes
'133': grass
'134': guitar
'135': hamburger
'136': hammer
'137': hand
'138': harp
'139': hat
'140': headphones
'141': hedgehog
'142': helicopter
'143': helmet
'144': hexagon
'145': hockey puck
'146': hockey stick
'147': horse
'148': hospital
'149': hot air balloon
'150': hot dog
'151': hot tub
'152': hourglass
'153': house plant
'154': house
'155': hurricane
'156': ice cream
'157': jacket
'158': jail
'159': kangaroo
'160': key
'161': keyboard
'162': knee
'163': knife
'164': ladder
'165': lantern
'166': laptop
'167': leaf
'168': leg
'169': light bulb
'170': lighter
'171': lighthouse
'172': lightning
'173': line
'174': lion
'175': lipstick
'176': lobster
'177': lollipop
'178': mailbox
'179': map
'180': marker
'181': matches
'182': megaphone
'183': mermaid
'184': microphone
'185': microwave
'186': monkey
'187': moon
'188': mosquito
'189': motorbike
'190': mountain
'191': mouse
'192': moustache
'193': mouth
'194': mug
'195': mushroom
'196': nail
'197': necklace
'198': nose
'199': ocean
'200': octagon
'201': octopus
'202': onion
'203': oven
'204': owl
'205': paint can
'206': paintbrush
'207': palm tree
'208': panda
'209': pants
'210': paper clip
'211': parachute
'212': parrot
'213': passport
'214': peanut
'215': pear
'216': peas
'217': pencil
'218': penguin
'219': piano
'220': pickup truck
'221': picture frame
'222': pig
'223': pillow
'224': pineapple
'225': pizza
'226': pliers
'227': police car
'228': pond
'229': pool
'230': popsicle
'231': postcard
'232': potato
'233': power outlet
'234': purse
'235': rabbit
'236': raccoon
'237': radio
'238': rain
'239': rainbow
'240': rake
'241': remote control
'242': rhinoceros
'243': rifle
'244': river
'245': roller coaster
'246': rollerskates
'247': sailboat
'248': sandwich
'249': saw
'250': saxophone
'251': school bus
'252': scissors
'253': scorpion
'254': screwdriver
'255': sea turtle
'256': see saw
'257': shark
'258': sheep
'259': shoe
'260': shorts
'261': shovel
'262': sink
'263': skateboard
'264': skull
'265': skyscraper
'266': sleeping bag
'267': smiley face
'268': snail
'269': snake
'270': snorkel
'271': snowflake
'272': snowman
'273': soccer ball
'274': sock
'275': speedboat
'276': spider
'277': spoon
'278': spreadsheet
'279': square
'280': squiggle
'281': squirrel
'282': stairs
'283': star
'284': steak
'285': stereo
'286': stethoscope
'287': stitches
'288': stop sign
'289': stove
'290': strawberry
'291': streetlight
'292': string bean
'293': submarine
'294': suitcase
'295': sun
'296': swan
'297': sweater
'298': swing set
'299': sword
'300': syringe
'301': t-shirt
'302': table
'303': teapot
'304': teddy-bear
'305': telephone
'306': television
'307': tennis racquet
'308': tent
'309': The Eiffel Tower
'310': The Great Wall of China
'311': The Mona Lisa
'312': tiger
'313': toaster
'314': toe
'315': toilet
'316': tooth
'317': toothbrush
'318': toothpaste
'319': tornado
'320': tractor
'321': traffic light
'322': train
'323': tree
'324': triangle
'325': trombone
'326': truck
'327': trumpet
'328': umbrella
'329': underwear
'330': van
'331': vase
'332': violin
'333': washing machine
'334': watermelon
'335': waterslide
'336': whale
'337': wheel
'338': windmill
'339': wine bottle
'340': wine glass
'341': wristwatch
'342': yoga
'343': zebra
'344': zigzag
- name: recognized
dtype: bool
- name: timestamp
dtype: timestamp[us, tz=UTC]
- name: countrycode
dtype: string
- name: drawing
sequence:
- name: x
sequence: float32
- name: y
sequence: float32
- name: t
sequence: int32
splits:
- name: train
num_bytes: 134763164880
num_examples: 50426266
download_size: 194810597157
dataset_size: 134763164880
- config_name: preprocessed_simplified_drawings
features:
- name: key_id
dtype: string
- name: word
dtype:
class_label:
names:
'0': aircraft carrier
'1': airplane
'2': alarm clock
'3': ambulance
'4': angel
'5': animal migration
'6': ant
'7': anvil
'8': apple
'9': arm
'10': asparagus
'11': axe
'12': backpack
'13': banana
'14': bandage
'15': barn
'16': baseball bat
'17': baseball
'18': basket
'19': basketball
'20': bat
'21': bathtub
'22': beach
'23': bear
'24': beard
'25': bed
'26': bee
'27': belt
'28': bench
'29': bicycle
'30': binoculars
'31': bird
'32': birthday cake
'33': blackberry
'34': blueberry
'35': book
'36': boomerang
'37': bottlecap
'38': bowtie
'39': bracelet
'40': brain
'41': bread
'42': bridge
'43': broccoli
'44': broom
'45': bucket
'46': bulldozer
'47': bus
'48': bush
'49': butterfly
'50': cactus
'51': cake
'52': calculator
'53': calendar
'54': camel
'55': camera
'56': camouflage
'57': campfire
'58': candle
'59': cannon
'60': canoe
'61': car
'62': carrot
'63': castle
'64': cat
'65': ceiling fan
'66': cell phone
'67': cello
'68': chair
'69': chandelier
'70': church
'71': circle
'72': clarinet
'73': clock
'74': cloud
'75': coffee cup
'76': compass
'77': computer
'78': cookie
'79': cooler
'80': couch
'81': cow
'82': crab
'83': crayon
'84': crocodile
'85': crown
'86': cruise ship
'87': cup
'88': diamond
'89': dishwasher
'90': diving board
'91': dog
'92': dolphin
'93': donut
'94': door
'95': dragon
'96': dresser
'97': drill
'98': drums
'99': duck
'100': dumbbell
'101': ear
'102': elbow
'103': elephant
'104': envelope
'105': eraser
'106': eye
'107': eyeglasses
'108': face
'109': fan
'110': feather
'111': fence
'112': finger
'113': fire hydrant
'114': fireplace
'115': firetruck
'116': fish
'117': flamingo
'118': flashlight
'119': flip flops
'120': floor lamp
'121': flower
'122': flying saucer
'123': foot
'124': fork
'125': frog
'126': frying pan
'127': garden hose
'128': garden
'129': giraffe
'130': goatee
'131': golf club
'132': grapes
'133': grass
'134': guitar
'135': hamburger
'136': hammer
'137': hand
'138': harp
'139': hat
'140': headphones
'141': hedgehog
'142': helicopter
'143': helmet
'144': hexagon
'145': hockey puck
'146': hockey stick
'147': horse
'148': hospital
'149': hot air balloon
'150': hot dog
'151': hot tub
'152': hourglass
'153': house plant
'154': house
'155': hurricane
'156': ice cream
'157': jacket
'158': jail
'159': kangaroo
'160': key
'161': keyboard
'162': knee
'163': knife
'164': ladder
'165': lantern
'166': laptop
'167': leaf
'168': leg
'169': light bulb
'170': lighter
'171': lighthouse
'172': lightning
'173': line
'174': lion
'175': lipstick
'176': lobster
'177': lollipop
'178': mailbox
'179': map
'180': marker
'181': matches
'182': megaphone
'183': mermaid
'184': microphone
'185': microwave
'186': monkey
'187': moon
'188': mosquito
'189': motorbike
'190': mountain
'191': mouse
'192': moustache
'193': mouth
'194': mug
'195': mushroom
'196': nail
'197': necklace
'198': nose
'199': ocean
'200': octagon
'201': octopus
'202': onion
'203': oven
'204': owl
'205': paint can
'206': paintbrush
'207': palm tree
'208': panda
'209': pants
'210': paper clip
'211': parachute
'212': parrot
'213': passport
'214': peanut
'215': pear
'216': peas
'217': pencil
'218': penguin
'219': piano
'220': pickup truck
'221': picture frame
'222': pig
'223': pillow
'224': pineapple
'225': pizza
'226': pliers
'227': police car
'228': pond
'229': pool
'230': popsicle
'231': postcard
'232': potato
'233': power outlet
'234': purse
'235': rabbit
'236': raccoon
'237': radio
'238': rain
'239': rainbow
'240': rake
'241': remote control
'242': rhinoceros
'243': rifle
'244': river
'245': roller coaster
'246': rollerskates
'247': sailboat
'248': sandwich
'249': saw
'250': saxophone
'251': school bus
'252': scissors
'253': scorpion
'254': screwdriver
'255': sea turtle
'256': see saw
'257': shark
'258': sheep
'259': shoe
'260': shorts
'261': shovel
'262': sink
'263': skateboard
'264': skull
'265': skyscraper
'266': sleeping bag
'267': smiley face
'268': snail
'269': snake
'270': snorkel
'271': snowflake
'272': snowman
'273': soccer ball
'274': sock
'275': speedboat
'276': spider
'277': spoon
'278': spreadsheet
'279': square
'280': squiggle
'281': squirrel
'282': stairs
'283': star
'284': steak
'285': stereo
'286': stethoscope
'287': stitches
'288': stop sign
'289': stove
'290': strawberry
'291': streetlight
'292': string bean
'293': submarine
'294': suitcase
'295': sun
'296': swan
'297': sweater
'298': swing set
'299': sword
'300': syringe
'301': t-shirt
'302': table
'303': teapot
'304': teddy-bear
'305': telephone
'306': television
'307': tennis racquet
'308': tent
'309': The Eiffel Tower
'310': The Great Wall of China
'311': The Mona Lisa
'312': tiger
'313': toaster
'314': toe
'315': toilet
'316': tooth
'317': toothbrush
'318': toothpaste
'319': tornado
'320': tractor
'321': traffic light
'322': train
'323': tree
'324': triangle
'325': trombone
'326': truck
'327': trumpet
'328': umbrella
'329': underwear
'330': van
'331': vase
'332': violin
'333': washing machine
'334': watermelon
'335': waterslide
'336': whale
'337': wheel
'338': windmill
'339': wine bottle
'340': wine glass
'341': wristwatch
'342': yoga
'343': zebra
'344': zigzag
- name: recognized
dtype: bool
- name: timestamp
dtype: timestamp[us, tz=UTC]
- name: countrycode
dtype: string
- name: drawing
sequence:
- name: x
sequence: uint8
- name: y
sequence: uint8
splits:
- name: train
num_bytes: 9741454188
num_examples: 50426266
download_size: 5889968422
dataset_size: 9741454188
- config_name: preprocessed_bitmaps
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': aircraft carrier
'1': airplane
'2': alarm clock
'3': ambulance
'4': angel
'5': animal migration
'6': ant
'7': anvil
'8': apple
'9': arm
'10': asparagus
'11': axe
'12': backpack
'13': banana
'14': bandage
'15': barn
'16': baseball bat
'17': baseball
'18': basket
'19': basketball
'20': bat
'21': bathtub
'22': beach
'23': bear
'24': beard
'25': bed
'26': bee
'27': belt
'28': bench
'29': bicycle
'30': binoculars
'31': bird
'32': birthday cake
'33': blackberry
'34': blueberry
'35': book
'36': boomerang
'37': bottlecap
'38': bowtie
'39': bracelet
'40': brain
'41': bread
'42': bridge
'43': broccoli
'44': broom
'45': bucket
'46': bulldozer
'47': bus
'48': bush
'49': butterfly
'50': cactus
'51': cake
'52': calculator
'53': calendar
'54': camel
'55': camera
'56': camouflage
'57': campfire
'58': candle
'59': cannon
'60': canoe
'61': car
'62': carrot
'63': castle
'64': cat
'65': ceiling fan
'66': cell phone
'67': cello
'68': chair
'69': chandelier
'70': church
'71': circle
'72': clarinet
'73': clock
'74': cloud
'75': coffee cup
'76': compass
'77': computer
'78': cookie
'79': cooler
'80': couch
'81': cow
'82': crab
'83': crayon
'84': crocodile
'85': crown
'86': cruise ship
'87': cup
'88': diamond
'89': dishwasher
'90': diving board
'91': dog
'92': dolphin
'93': donut
'94': door
'95': dragon
'96': dresser
'97': drill
'98': drums
'99': duck
'100': dumbbell
'101': ear
'102': elbow
'103': elephant
'104': envelope
'105': eraser
'106': eye
'107': eyeglasses
'108': face
'109': fan
'110': feather
'111': fence
'112': finger
'113': fire hydrant
'114': fireplace
'115': firetruck
'116': fish
'117': flamingo
'118': flashlight
'119': flip flops
'120': floor lamp
'121': flower
'122': flying saucer
'123': foot
'124': fork
'125': frog
'126': frying pan
'127': garden hose
'128': garden
'129': giraffe
'130': goatee
'131': golf club
'132': grapes
'133': grass
'134': guitar
'135': hamburger
'136': hammer
'137': hand
'138': harp
'139': hat
'140': headphones
'141': hedgehog
'142': helicopter
'143': helmet
'144': hexagon
'145': hockey puck
'146': hockey stick
'147': horse
'148': hospital
'149': hot air balloon
'150': hot dog
'151': hot tub
'152': hourglass
'153': house plant
'154': house
'155': hurricane
'156': ice cream
'157': jacket
'158': jail
'159': kangaroo
'160': key
'161': keyboard
'162': knee
'163': knife
'164': ladder
'165': lantern
'166': laptop
'167': leaf
'168': leg
'169': light bulb
'170': lighter
'171': lighthouse
'172': lightning
'173': line
'174': lion
'175': lipstick
'176': lobster
'177': lollipop
'178': mailbox
'179': map
'180': marker
'181': matches
'182': megaphone
'183': mermaid
'184': microphone
'185': microwave
'186': monkey
'187': moon
'188': mosquito
'189': motorbike
'190': mountain
'191': mouse
'192': moustache
'193': mouth
'194': mug
'195': mushroom
'196': nail
'197': necklace
'198': nose
'199': ocean
'200': octagon
'201': octopus
'202': onion
'203': oven
'204': owl
'205': paint can
'206': paintbrush
'207': palm tree
'208': panda
'209': pants
'210': paper clip
'211': parachute
'212': parrot
'213': passport
'214': peanut
'215': pear
'216': peas
'217': pencil
'218': penguin
'219': piano
'220': pickup truck
'221': picture frame
'222': pig
'223': pillow
'224': pineapple
'225': pizza
'226': pliers
'227': police car
'228': pond
'229': pool
'230': popsicle
'231': postcard
'232': potato
'233': power outlet
'234': purse
'235': rabbit
'236': raccoon
'237': radio
'238': rain
'239': rainbow
'240': rake
'241': remote control
'242': rhinoceros
'243': rifle
'244': river
'245': roller coaster
'246': rollerskates
'247': sailboat
'248': sandwich
'249': saw
'250': saxophone
'251': school bus
'252': scissors
'253': scorpion
'254': screwdriver
'255': sea turtle
'256': see saw
'257': shark
'258': sheep
'259': shoe
'260': shorts
'261': shovel
'262': sink
'263': skateboard
'264': skull
'265': skyscraper
'266': sleeping bag
'267': smiley face
'268': snail
'269': snake
'270': snorkel
'271': snowflake
'272': snowman
'273': soccer ball
'274': sock
'275': speedboat
'276': spider
'277': spoon
'278': spreadsheet
'279': square
'280': squiggle
'281': squirrel
'282': stairs
'283': star
'284': steak
'285': stereo
'286': stethoscope
'287': stitches
'288': stop sign
'289': stove
'290': strawberry
'291': streetlight
'292': string bean
'293': submarine
'294': suitcase
'295': sun
'296': swan
'297': sweater
'298': swing set
'299': sword
'300': syringe
'301': t-shirt
'302': table
'303': teapot
'304': teddy-bear
'305': telephone
'306': television
'307': tennis racquet
'308': tent
'309': The Eiffel Tower
'310': The Great Wall of China
'311': The Mona Lisa
'312': tiger
'313': toaster
'314': toe
'315': toilet
'316': tooth
'317': toothbrush
'318': toothpaste
'319': tornado
'320': tractor
'321': traffic light
'322': train
'323': tree
'324': triangle
'325': trombone
'326': truck
'327': trumpet
'328': umbrella
'329': underwear
'330': van
'331': vase
'332': violin
'333': washing machine
'334': watermelon
'335': waterslide
'336': whale
'337': wheel
'338': windmill
'339': wine bottle
'340': wine glass
'341': wristwatch
'342': yoga
'343': zebra
'344': zigzag
splits:
- name: train
num_bytes: 20372624628
num_examples: 50426266
download_size: 39534220144
dataset_size: 20372624628
- config_name: sketch_rnn
features:
- name: word
dtype:
class_label:
names:
'0': aircraft carrier
'1': airplane
'2': alarm clock
'3': ambulance
'4': angel
'5': animal migration
'6': ant
'7': anvil
'8': apple
'9': arm
'10': asparagus
'11': axe
'12': backpack
'13': banana
'14': bandage
'15': barn
'16': baseball bat
'17': baseball
'18': basket
'19': basketball
'20': bat
'21': bathtub
'22': beach
'23': bear
'24': beard
'25': bed
'26': bee
'27': belt
'28': bench
'29': bicycle
'30': binoculars
'31': bird
'32': birthday cake
'33': blackberry
'34': blueberry
'35': book
'36': boomerang
'37': bottlecap
'38': bowtie
'39': bracelet
'40': brain
'41': bread
'42': bridge
'43': broccoli
'44': broom
'45': bucket
'46': bulldozer
'47': bus
'48': bush
'49': butterfly
'50': cactus
'51': cake
'52': calculator
'53': calendar
'54': camel
'55': camera
'56': camouflage
'57': campfire
'58': candle
'59': cannon
'60': canoe
'61': car
'62': carrot
'63': castle
'64': cat
'65': ceiling fan
'66': cell phone
'67': cello
'68': chair
'69': chandelier
'70': church
'71': circle
'72': clarinet
'73': clock
'74': cloud
'75': coffee cup
'76': compass
'77': computer
'78': cookie
'79': cooler
'80': couch
'81': cow
'82': crab
'83': crayon
'84': crocodile
'85': crown
'86': cruise ship
'87': cup
'88': diamond
'89': dishwasher
'90': diving board
'91': dog
'92': dolphin
'93': donut
'94': door
'95': dragon
'96': dresser
'97': drill
'98': drums
'99': duck
'100': dumbbell
'101': ear
'102': elbow
'103': elephant
'104': envelope
'105': eraser
'106': eye
'107': eyeglasses
'108': face
'109': fan
'110': feather
'111': fence
'112': finger
'113': fire hydrant
'114': fireplace
'115': firetruck
'116': fish
'117': flamingo
'118': flashlight
'119': flip flops
'120': floor lamp
'121': flower
'122': flying saucer
'123': foot
'124': fork
'125': frog
'126': frying pan
'127': garden hose
'128': garden
'129': giraffe
'130': goatee
'131': golf club
'132': grapes
'133': grass
'134': guitar
'135': hamburger
'136': hammer
'137': hand
'138': harp
'139': hat
'140': headphones
'141': hedgehog
'142': helicopter
'143': helmet
'144': hexagon
'145': hockey puck
'146': hockey stick
'147': horse
'148': hospital
'149': hot air balloon
'150': hot dog
'151': hot tub
'152': hourglass
'153': house plant
'154': house
'155': hurricane
'156': ice cream
'157': jacket
'158': jail
'159': kangaroo
'160': key
'161': keyboard
'162': knee
'163': knife
'164': ladder
'165': lantern
'166': laptop
'167': leaf
'168': leg
'169': light bulb
'170': lighter
'171': lighthouse
'172': lightning
'173': line
'174': lion
'175': lipstick
'176': lobster
'177': lollipop
'178': mailbox
'179': map
'180': marker
'181': matches
'182': megaphone
'183': mermaid
'184': microphone
'185': microwave
'186': monkey
'187': moon
'188': mosquito
'189': motorbike
'190': mountain
'191': mouse
'192': moustache
'193': mouth
'194': mug
'195': mushroom
'196': nail
'197': necklace
'198': nose
'199': ocean
'200': octagon
'201': octopus
'202': onion
'203': oven
'204': owl
'205': paint can
'206': paintbrush
'207': palm tree
'208': panda
'209': pants
'210': paper clip
'211': parachute
'212': parrot
'213': passport
'214': peanut
'215': pear
'216': peas
'217': pencil
'218': penguin
'219': piano
'220': pickup truck
'221': picture frame
'222': pig
'223': pillow
'224': pineapple
'225': pizza
'226': pliers
'227': police car
'228': pond
'229': pool
'230': popsicle
'231': postcard
'232': potato
'233': power outlet
'234': purse
'235': rabbit
'236': raccoon
'237': radio
'238': rain
'239': rainbow
'240': rake
'241': remote control
'242': rhinoceros
'243': rifle
'244': river
'245': roller coaster
'246': rollerskates
'247': sailboat
'248': sandwich
'249': saw
'250': saxophone
'251': school bus
'252': scissors
'253': scorpion
'254': screwdriver
'255': sea turtle
'256': see saw
'257': shark
'258': sheep
'259': shoe
'260': shorts
'261': shovel
'262': sink
'263': skateboard
'264': skull
'265': skyscraper
'266': sleeping bag
'267': smiley face
'268': snail
'269': snake
'270': snorkel
'271': snowflake
'272': snowman
'273': soccer ball
'274': sock
'275': speedboat
'276': spider
'277': spoon
'278': spreadsheet
'279': square
'280': squiggle
'281': squirrel
'282': stairs
'283': star
'284': steak
'285': stereo
'286': stethoscope
'287': stitches
'288': stop sign
'289': stove
'290': strawberry
'291': streetlight
'292': string bean
'293': submarine
'294': suitcase
'295': sun
'296': swan
'297': sweater
'298': swing set
'299': sword
'300': syringe
'301': t-shirt
'302': table
'303': teapot
'304': teddy-bear
'305': telephone
'306': television
'307': tennis racquet
'308': tent
'309': The Eiffel Tower
'310': The Great Wall of China
'311': The Mona Lisa
'312': tiger
'313': toaster
'314': toe
'315': toilet
'316': tooth
'317': toothbrush
'318': toothpaste
'319': tornado
'320': tractor
'321': traffic light
'322': train
'323': tree
'324': triangle
'325': trombone
'326': truck
'327': trumpet
'328': umbrella
'329': underwear
'330': van
'331': vase
'332': violin
'333': washing machine
'334': watermelon
'335': waterslide
'336': whale
'337': wheel
'338': windmill
'339': wine bottle
'340': wine glass
'341': wristwatch
'342': yoga
'343': zebra
'344': zigzag
- name: drawing
dtype:
array2_d:
shape:
- 3
dtype: int16
splits:
- name: train
num_bytes: 13056229420
num_examples: 24150000
- name: validation
num_bytes: 466485546
num_examples: 862500
- name: test
num_bytes: 466191706
num_examples: 862500
download_size: 3928904911
dataset_size: 13988906672
- config_name: sketch_rnn_full
features:
- name: word
dtype:
class_label:
names:
'0': aircraft carrier
'1': airplane
'2': alarm clock
'3': ambulance
'4': angel
'5': animal migration
'6': ant
'7': anvil
'8': apple
'9': arm
'10': asparagus
'11': axe
'12': backpack
'13': banana
'14': bandage
'15': barn
'16': baseball bat
'17': baseball
'18': basket
'19': basketball
'20': bat
'21': bathtub
'22': beach
'23': bear
'24': beard
'25': bed
'26': bee
'27': belt
'28': bench
'29': bicycle
'30': binoculars
'31': bird
'32': birthday cake
'33': blackberry
'34': blueberry
'35': book
'36': boomerang
'37': bottlecap
'38': bowtie
'39': bracelet
'40': brain
'41': bread
'42': bridge
'43': broccoli
'44': broom
'45': bucket
'46': bulldozer
'47': bus
'48': bush
'49': butterfly
'50': cactus
'51': cake
'52': calculator
'53': calendar
'54': camel
'55': camera
'56': camouflage
'57': campfire
'58': candle
'59': cannon
'60': canoe
'61': car
'62': carrot
'63': castle
'64': cat
'65': ceiling fan
'66': cell phone
'67': cello
'68': chair
'69': chandelier
'70': church
'71': circle
'72': clarinet
'73': clock
'74': cloud
'75': coffee cup
'76': compass
'77': computer
'78': cookie
'79': cooler
'80': couch
'81': cow
'82': crab
'83': crayon
'84': crocodile
'85': crown
'86': cruise ship
'87': cup
'88': diamond
'89': dishwasher
'90': diving board
'91': dog
'92': dolphin
'93': donut
'94': door
'95': dragon
'96': dresser
'97': drill
'98': drums
'99': duck
'100': dumbbell
'101': ear
'102': elbow
'103': elephant
'104': envelope
'105': eraser
'106': eye
'107': eyeglasses
'108': face
'109': fan
'110': feather
'111': fence
'112': finger
'113': fire hydrant
'114': fireplace
'115': firetruck
'116': fish
'117': flamingo
'118': flashlight
'119': flip flops
'120': floor lamp
'121': flower
'122': flying saucer
'123': foot
'124': fork
'125': frog
'126': frying pan
'127': garden hose
'128': garden
'129': giraffe
'130': goatee
'131': golf club
'132': grapes
'133': grass
'134': guitar
'135': hamburger
'136': hammer
'137': hand
'138': harp
'139': hat
'140': headphones
'141': hedgehog
'142': helicopter
'143': helmet
'144': hexagon
'145': hockey puck
'146': hockey stick
'147': horse
'148': hospital
'149': hot air balloon
'150': hot dog
'151': hot tub
'152': hourglass
'153': house plant
'154': house
'155': hurricane
'156': ice cream
'157': jacket
'158': jail
'159': kangaroo
'160': key
'161': keyboard
'162': knee
'163': knife
'164': ladder
'165': lantern
'166': laptop
'167': leaf
'168': leg
'169': light bulb
'170': lighter
'171': lighthouse
'172': lightning
'173': line
'174': lion
'175': lipstick
'176': lobster
'177': lollipop
'178': mailbox
'179': map
'180': marker
'181': matches
'182': megaphone
'183': mermaid
'184': microphone
'185': microwave
'186': monkey
'187': moon
'188': mosquito
'189': motorbike
'190': mountain
'191': mouse
'192': moustache
'193': mouth
'194': mug
'195': mushroom
'196': nail
'197': necklace
'198': nose
'199': ocean
'200': octagon
'201': octopus
'202': onion
'203': oven
'204': owl
'205': paint can
'206': paintbrush
'207': palm tree
'208': panda
'209': pants
'210': paper clip
'211': parachute
'212': parrot
'213': passport
'214': peanut
'215': pear
'216': peas
'217': pencil
'218': penguin
'219': piano
'220': pickup truck
'221': picture frame
'222': pig
'223': pillow
'224': pineapple
'225': pizza
'226': pliers
'227': police car
'228': pond
'229': pool
'230': popsicle
'231': postcard
'232': potato
'233': power outlet
'234': purse
'235': rabbit
'236': raccoon
'237': radio
'238': rain
'239': rainbow
'240': rake
'241': remote control
'242': rhinoceros
'243': rifle
'244': river
'245': roller coaster
'246': rollerskates
'247': sailboat
'248': sandwich
'249': saw
'250': saxophone
'251': school bus
'252': scissors
'253': scorpion
'254': screwdriver
'255': sea turtle
'256': see saw
'257': shark
'258': sheep
'259': shoe
'260': shorts
'261': shovel
'262': sink
'263': skateboard
'264': skull
'265': skyscraper
'266': sleeping bag
'267': smiley face
'268': snail
'269': snake
'270': snorkel
'271': snowflake
'272': snowman
'273': soccer ball
'274': sock
'275': speedboat
'276': spider
'277': spoon
'278': spreadsheet
'279': square
'280': squiggle
'281': squirrel
'282': stairs
'283': star
'284': steak
'285': stereo
'286': stethoscope
'287': stitches
'288': stop sign
'289': stove
'290': strawberry
'291': streetlight
'292': string bean
'293': submarine
'294': suitcase
'295': sun
'296': swan
'297': sweater
'298': swing set
'299': sword
'300': syringe
'301': t-shirt
'302': table
'303': teapot
'304': teddy-bear
'305': telephone
'306': television
'307': tennis racquet
'308': tent
'309': The Eiffel Tower
'310': The Great Wall of China
'311': The Mona Lisa
'312': tiger
'313': toaster
'314': toe
'315': toilet
'316': tooth
'317': toothbrush
'318': toothpaste
'319': tornado
'320': tractor
'321': traffic light
'322': train
'323': tree
'324': triangle
'325': trombone
'326': truck
'327': trumpet
'328': umbrella
'329': underwear
'330': van
'331': vase
'332': violin
'333': washing machine
'334': watermelon
'335': waterslide
'336': whale
'337': wheel
'338': windmill
'339': wine bottle
'340': wine glass
'341': wristwatch
'342': yoga
'343': zebra
'344': zigzag
- name: drawing
dtype:
array2_d:
shape:
- 3
dtype: int16
splits:
- name: train
num_bytes: 23725242280
num_examples: 43988874
- name: validation
num_bytes: 466485546
num_examples: 862500
- name: test
num_bytes: 466191706
num_examples: 862500
download_size: 6928245966
dataset_size: 24657919532
---
# Dataset Card for Quick, Draw!
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Quick, Draw! homepage](https://quickdraw.withgoogle.com/data)
- **Repository:** [Quick, Draw! repository](https://github.com/googlecreativelab/quickdraw-dataset)
- **Paper:** [A Neural Representation of Sketch Drawings](https://arxiv.org/abs/1704.03477v4)
- **Leaderboard:** [Quick, Draw! Doodle Recognition Challenge](https://www.kaggle.com/competitions/quickdraw-doodle-recognition/leaderboard)
- **Point of Contact:** [Quick, Draw! support](mailto:quickdraw-support@google.com)
### Dataset Summary
The Quick Draw Dataset is a collection of 50 million drawings across 345 categories, contributed by players of the game Quick, Draw!. The drawings were captured as timestamped vectors, tagged with metadata including what the player was asked to draw and in which country the player was located.
### Supported Tasks and Leaderboards
- `image-classification`: The goal of this task is to classify a given sketch into one of 345 classes.
The (closed) leaderboard for this task is available [here](https://www.kaggle.com/competitions/quickdraw-doodle-recognition/leaderboard).
### Languages
English.
## Dataset Structure
### Data Instances
#### `raw`
A data point comprises a drawing and its metadata.
```
{
'key_id': '5475678961008640',
'word': 0,
'recognized': True,
'timestamp': datetime.datetime(2017, 3, 28, 13, 28, 0, 851730),
'countrycode': 'MY',
'drawing': {
'x': [[379.0, 380.0, 381.0, 381.0, 381.0, 381.0, 382.0], [362.0, 368.0, 375.0, 380.0, 388.0, 393.0, 399.0, 404.0, 409.0, 410.0, 410.0, 405.0, 397.0, 392.0, 384.0, 377.0, 370.0, 363.0, 356.0, 348.0, 342.0, 336.0, 333.0], ..., [477.0, 473.0, 471.0, 469.0, 468.0, 466.0, 464.0, 462.0, 461.0, 469.0, 475.0, 483.0, 491.0, 499.0, 510.0, 521.0, 531.0, 540.0, 548.0, 558.0, 566.0, 576.0, 583.0, 590.0, 595.0, 598.0, 597.0, 596.0, 594.0, 592.0, 590.0, 589.0, 588.0, 586.0]],
'y': [[1.0, 7.0, 15.0, 21.0, 27.0, 32.0, 32.0], [17.0, 17.0, 17.0, 17.0, 16.0, 16.0, 16.0, 16.0, 18.0, 23.0, 29.0, 32.0, 32.0, 32.0, 29.0, 27.0, 25.0, 23.0, 21.0, 19.0, 17.0, 16.0, 14.0], ..., [151.0, 146.0, 139.0, 131.0, 125.0, 119.0, 113.0, 107.0, 102.0, 99.0, 98.0, 98.0, 98.0, 98.0, 98.0, 98.0, 98.0, 98.0, 98.0, 98.0, 98.0, 100.0, 102.0, 104.0, 105.0, 110.0, 115.0, 121.0, 126.0, 131.0, 137.0, 142.0, 148.0, 150.0]],
't': [[0, 84, 100, 116, 132, 148, 260], [573, 636, 652, 660, 676, 684, 701, 724, 796, 838, 860, 956, 973, 979, 989, 995, 1005, 1012, 1020, 1028, 1036, 1053, 1118], ..., [8349, 8446, 8468, 8484, 8500, 8516, 8541, 8557, 8573, 8685, 8693, 8702, 8710, 8718, 8724, 8732, 8741, 8748, 8757, 8764, 8773, 8780, 8788, 8797, 8804, 8965, 8996, 9029, 9045, 9061, 9076, 9092, 9109, 9167]]
}
}
```
#### `preprocessed_simplified_drawings`
The simplified version of the dataset, generated from the `raw` data by simplifying the vectors, removing the timing information, and positioning and scaling the data into a 256x256 region.
The simplification process was:
1. Align the drawing to the top-left corner, to have minimum values of 0.
2. Uniformly scale the drawing, to have a maximum value of 255.
3. Resample all strokes with a 1 pixel spacing.
4. Simplify all strokes using the [Ramer-Douglas-Peucker algorithm](https://en.wikipedia.org/wiki/Ramer%E2%80%93Douglas%E2%80%93Peucker_algorithm) with an epsilon value of 2.0.
A minimal code sketch of these steps follows the example instance below.
```
{
'key_id': '5475678961008640',
'word': 0,
'recognized': True,
'timestamp': datetime.datetime(2017, 3, 28, 15, 28),
'countrycode': 'MY',
'drawing': {
'x': [[31, 32], [27, 37, 38, 35, 21], [25, 28, 38, 39], [33, 34, 32], [5, 188, 254, 251, 241, 185, 45, 9, 0], [35, 35, 43, 125, 126], [35, 76, 80, 77], [53, 50, 54, 80, 78]],
'y': [[0, 7], [4, 4, 6, 7, 3], [5, 10, 10, 7], [4, 33, 44], [50, 50, 54, 83, 86, 90, 86, 77, 52], [85, 91, 92, 96, 90], [35, 37, 41, 47], [34, 23, 22, 23, 34]]
}
}
```
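For illustration, here is a minimal sketch of steps 1, 2 and 4 of this pipeline (the 1-pixel resampling of step 3 is omitted for brevity). It assumes the third-party [`rdp`](https://pypi.org/project/rdp/) package for the Ramer-Douglas-Peucker step and is an approximation, not the exact code used to produce the dataset:
```python
import numpy as np
from rdp import rdp  # pip install rdp (third-party Ramer-Douglas-Peucker implementation)

def simplify_drawing(drawing, epsilon=2.0):
    """Approximates steps 1, 2 and 4 of the simplification pipeline for one raw drawing.

    Args:
        drawing: the `drawing` dict of a `raw` example, with per-stroke lists under 'x' and 'y'
        epsilon: Ramer-Douglas-Peucker tolerance (2.0 in the official pipeline)
    """
    all_x = np.concatenate([np.asarray(s, dtype=float) for s in drawing["x"]])
    all_y = np.concatenate([np.asarray(s, dtype=float) for s in drawing["y"]])
    min_x, min_y = all_x.min(), all_y.min()
    # Uniform scale factor so the largest extent maps to 255
    scale = max(all_x.max() - min_x, all_y.max() - min_y) or 1.0
    simplified = {"x": [], "y": []}
    for xs, ys in zip(drawing["x"], drawing["y"]):
        pts = np.stack([np.asarray(xs, dtype=float), np.asarray(ys, dtype=float)], axis=1)
        pts = (pts - [min_x, min_y]) * (255.0 / scale)  # steps 1 and 2: align and scale
        pts = rdp(pts, epsilon=epsilon)                 # step 4: simplify each stroke
        simplified["x"].append(pts[:, 0].round().astype(int).tolist())
        simplified["y"].append(pts[:, 1].round().astype(int).tolist())
    return simplified
```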
#### `preprocessed_bitmaps` (default configuration)
This configuration contains the 28x28 grayscale bitmap images that were generated from the simplified data, but are aligned to the center of the drawing's bounding box rather than the top-left corner. The code that was used for generation is available [here](https://github.com/googlecreativelab/quickdraw-dataset/issues/19#issuecomment-402247262).
```
{
'image': <PIL.PngImagePlugin.PngImageFile image mode=L size=28x28 at 0x10B5B102828>,
'label': 0
}
```
#### `sketch_rnn` and `sketch_rnn_full`
The `sketch_rnn_full` configuration stores the data in a format suitable for input into a recurrent neural network and was used for training the [Sketch-RNN](https://arxiv.org/abs/1704.03477) model. Unlike `sketch_rnn`, where the samples have been randomly selected from each category, the `sketch_rnn_full` configuration contains the full data for each category.
```
{
'word': 0,
'drawing': [[132, 0, 0], [23, 4, 0], [61, 1, 0], [76, 0, 0], [22, -4, 0], [152, 0, 0], [50, -5, 0], [36, -10, 0], [8, 26, 0], [0, 69, 0], [-2, 11, 0], [-8, 10, 0], [-56, 24, 0], [-23, 14, 0], [-99, 40, 0], [-45, 6, 0], [-21, 6, 0], [-170, 2, 0], [-81, 0, 0], [-29, -9, 0], [-94, -19, 0], [-48, -24, 0], [-6, -16, 0], [2, -36, 0], [7, -29, 0], [23, -45, 0], [13, -6, 0], [41, -8, 0], [42, -2, 1], [392, 38, 0], [2, 19, 0], [11, 33, 0], [13, 0, 0], [24, -9, 0], [26, -27, 0], [0, -14, 0], [-8, -10, 0], [-18, -5, 0], [-14, 1, 0], [-23, 4, 0], [-21, 12, 1], [-152, 18, 0], [10, 46, 0], [26, 6, 0], [38, 0, 0], [31, -2, 0], [7, -2, 0], [4, -6, 0], [-10, -21, 0], [-2, -33, 0], [-6, -11, 0], [-46, 1, 0], [-39, 18, 0], [-19, 4, 1], [-122, 0, 0], [-2, 38, 0], [4, 16, 0], [6, 4, 0], [78, 0, 0], [4, -8, 0], [-8, -36, 0], [0, -22, 0], [-6, -2, 0], [-32, 14, 0], [-58, 13, 1], [-96, -12, 0], [-10, 27, 0], [2, 32, 0], [102, 0, 0], [1, -7, 0], [-27, -17, 0], [-4, -6, 0], [-1, -34, 0], [-64, 8, 1], [129, -138, 0], [-108, 0, 0], [-8, 12, 0], [-1, 15, 0], [12, 15, 0], [20, 5, 0], [61, -3, 0], [24, 6, 0], [19, 0, 0], [5, -4, 0], [2, 14, 1]]
}
```
### Data Fields
#### `raw`
- `key_id`: A unique identifier across all drawings.
- `word`: Category the player was prompted to draw.
- `recognized`: Whether the word was recognized by the game.
- `timestamp`: When the drawing was created.
- `countrycode`: A two letter country code ([ISO 3166-1 alpha-2](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2)) of where the player was located.
- `drawing`: A dictionary where `x` and `y` are the pixel coordinates, and `t` is the time in milliseconds since the first point. `x` and `y` are real-valued while `t` is an integer. `x`, `y` and `t` match in length and are represented as lists of lists where each sublist corresponds to a single stroke. The raw drawings can have vastly different bounding boxes and numbers of points due to the different devices used for display and input.
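For example, a small sketch that iterates over the strokes of a single `raw` example (a hedged illustration, assuming the `datasets` library):
```python
from datasets import load_dataset

raw_dataset = load_dataset("quickdraw", "raw", split="train")
drawing = raw_dataset[0]["drawing"]
# Each sublist of x/y/t is one stroke; t is in milliseconds since the first point
for i, (xs, ys, ts) in enumerate(zip(drawing["x"], drawing["y"], drawing["t"])):
    print(f"stroke {i}: {len(xs)} points, drawn between t={ts[0]} ms and t={ts[-1]} ms")
```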
#### `preprocessed_simplified_drawings`
- `key_id`: A unique identifier across all drawings.
- `word`: Category the player was prompted to draw.
- `recognized`: Whether the word was recognized by the game.
- `timestamp`: When the drawing was created.
- `countrycode`: A two letter country code ([ISO 3166-1 alpha-2](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2)) of where the player was located.
- `drawing`: A simplified drawing represented as a dictionary where `x` and `y` are the pixel coordinates. The simplification process is described in the `Data Instances` section.
#### `preprocessed_bitmaps` (default configuration)
- `image`: A `PIL.Image.Image` object containing the 28x28 grayscale bitmap. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `label`: Category the player was prompted to draw.
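A minimal access sketch following the decoding advice above (assuming the `datasets` library):
```python
from datasets import load_dataset

ds = load_dataset("quickdraw", "preprocessed_bitmaps", split="train")
sample = ds[0]           # query the row index first, so only this one image is decoded
image = sample["image"]  # 28x28 grayscale PIL.Image.Image
print(image.size, sample["label"])
```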
<details>
<summary>
Click here to see the full class labels mapping:
</summary>
|id|class|
|---|---|
|0|aircraft carrier|
|1|airplane|
|2|alarm clock|
|3|ambulance|
|4|angel|
|5|animal migration|
|6|ant|
|7|anvil|
|8|apple|
|9|arm|
|10|asparagus|
|11|axe|
|12|backpack|
|13|banana|
|14|bandage|
|15|barn|
|16|baseball bat|
|17|baseball|
|18|basket|
|19|basketball|
|20|bat|
|21|bathtub|
|22|beach|
|23|bear|
|24|beard|
|25|bed|
|26|bee|
|27|belt|
|28|bench|
|29|bicycle|
|30|binoculars|
|31|bird|
|32|birthday cake|
|33|blackberry|
|34|blueberry|
|35|book|
|36|boomerang|
|37|bottlecap|
|38|bowtie|
|39|bracelet|
|40|brain|
|41|bread|
|42|bridge|
|43|broccoli|
|44|broom|
|45|bucket|
|46|bulldozer|
|47|bus|
|48|bush|
|49|butterfly|
|50|cactus|
|51|cake|
|52|calculator|
|53|calendar|
|54|camel|
|55|camera|
|56|camouflage|
|57|campfire|
|58|candle|
|59|cannon|
|60|canoe|
|61|car|
|62|carrot|
|63|castle|
|64|cat|
|65|ceiling fan|
|66|cell phone|
|67|cello|
|68|chair|
|69|chandelier|
|70|church|
|71|circle|
|72|clarinet|
|73|clock|
|74|cloud|
|75|coffee cup|
|76|compass|
|77|computer|
|78|cookie|
|79|cooler|
|80|couch|
|81|cow|
|82|crab|
|83|crayon|
|84|crocodile|
|85|crown|
|86|cruise ship|
|87|cup|
|88|diamond|
|89|dishwasher|
|90|diving board|
|91|dog|
|92|dolphin|
|93|donut|
|94|door|
|95|dragon|
|96|dresser|
|97|drill|
|98|drums|
|99|duck|
|100|dumbbell|
|101|ear|
|102|elbow|
|103|elephant|
|104|envelope|
|105|eraser|
|106|eye|
|107|eyeglasses|
|108|face|
|109|fan|
|110|feather|
|111|fence|
|112|finger|
|113|fire hydrant|
|114|fireplace|
|115|firetruck|
|116|fish|
|117|flamingo|
|118|flashlight|
|119|flip flops|
|120|floor lamp|
|121|flower|
|122|flying saucer|
|123|foot|
|124|fork|
|125|frog|
|126|frying pan|
|127|garden hose|
|128|garden|
|129|giraffe|
|130|goatee|
|131|golf club|
|132|grapes|
|133|grass|
|134|guitar|
|135|hamburger|
|136|hammer|
|137|hand|
|138|harp|
|139|hat|
|140|headphones|
|141|hedgehog|
|142|helicopter|
|143|helmet|
|144|hexagon|
|145|hockey puck|
|146|hockey stick|
|147|horse|
|148|hospital|
|149|hot air balloon|
|150|hot dog|
|151|hot tub|
|152|hourglass|
|153|house plant|
|154|house|
|155|hurricane|
|156|ice cream|
|157|jacket|
|158|jail|
|159|kangaroo|
|160|key|
|161|keyboard|
|162|knee|
|163|knife|
|164|ladder|
|165|lantern|
|166|laptop|
|167|leaf|
|168|leg|
|169|light bulb|
|170|lighter|
|171|lighthouse|
|172|lightning|
|173|line|
|174|lion|
|175|lipstick|
|176|lobster|
|177|lollipop|
|178|mailbox|
|179|map|
|180|marker|
|181|matches|
|182|megaphone|
|183|mermaid|
|184|microphone|
|185|microwave|
|186|monkey|
|187|moon|
|188|mosquito|
|189|motorbike|
|190|mountain|
|191|mouse|
|192|moustache|
|193|mouth|
|194|mug|
|195|mushroom|
|196|nail|
|197|necklace|
|198|nose|
|199|ocean|
|200|octagon|
|201|octopus|
|202|onion|
|203|oven|
|204|owl|
|205|paint can|
|206|paintbrush|
|207|palm tree|
|208|panda|
|209|pants|
|210|paper clip|
|211|parachute|
|212|parrot|
|213|passport|
|214|peanut|
|215|pear|
|216|peas|
|217|pencil|
|218|penguin|
|219|piano|
|220|pickup truck|
|221|picture frame|
|222|pig|
|223|pillow|
|224|pineapple|
|225|pizza|
|226|pliers|
|227|police car|
|228|pond|
|229|pool|
|230|popsicle|
|231|postcard|
|232|potato|
|233|power outlet|
|234|purse|
|235|rabbit|
|236|raccoon|
|237|radio|
|238|rain|
|239|rainbow|
|240|rake|
|241|remote control|
|242|rhinoceros|
|243|rifle|
|244|river|
|245|roller coaster|
|246|rollerskates|
|247|sailboat|
|248|sandwich|
|249|saw|
|250|saxophone|
|251|school bus|
|252|scissors|
|253|scorpion|
|254|screwdriver|
|255|sea turtle|
|256|see saw|
|257|shark|
|258|sheep|
|259|shoe|
|260|shorts|
|261|shovel|
|262|sink|
|263|skateboard|
|264|skull|
|265|skyscraper|
|266|sleeping bag|
|267|smiley face|
|268|snail|
|269|snake|
|270|snorkel|
|271|snowflake|
|272|snowman|
|273|soccer ball|
|274|sock|
|275|speedboat|
|276|spider|
|277|spoon|
|278|spreadsheet|
|279|square|
|280|squiggle|
|281|squirrel|
|282|stairs|
|283|star|
|284|steak|
|285|stereo|
|286|stethoscope|
|287|stitches|
|288|stop sign|
|289|stove|
|290|strawberry|
|291|streetlight|
|292|string bean|
|293|submarine|
|294|suitcase|
|295|sun|
|296|swan|
|297|sweater|
|298|swing set|
|299|sword|
|300|syringe|
|301|t-shirt|
|302|table|
|303|teapot|
|304|teddy-bear|
|305|telephone|
|306|television|
|307|tennis racquet|
|308|tent|
|309|The Eiffel Tower|
|310|The Great Wall of China|
|311|The Mona Lisa|
|312|tiger|
|313|toaster|
|314|toe|
|315|toilet|
|316|tooth|
|317|toothbrush|
|318|toothpaste|
|319|tornado|
|320|tractor|
|321|traffic light|
|322|train|
|323|tree|
|324|triangle|
|325|trombone|
|326|truck|
|327|trumpet|
|328|umbrella|
|329|underwear|
|330|van|
|331|vase|
|332|violin|
|333|washing machine|
|334|watermelon|
|335|waterslide|
|336|whale|
|337|wheel|
|338|windmill|
|339|wine bottle|
|340|wine glass|
|341|wristwatch|
|342|yoga|
|343|zebra|
|344|zigzag|
</details>
#### `sketch_rnn` and `sketch_rnn_full`
- `word`: Category the player was prompted to draw.
- `drawing`: An array of strokes. Strokes are represented as 3-tuples consisting of x-offset, y-offset, and a binary variable which is 1 if the pen is lifted between this position and the next, and 0 otherwise.
<details>
<summary>
Click here to see the code for visualizing drawings in Jupyter Notebook or Google Colab:
</summary>
```python
import numpy as np
import svgwrite # pip install svgwrite
from IPython.display import SVG, display
def draw_strokes(drawing, factor=0.045):
"""Displays vector drawing as SVG.
Args:
drawing: a list of strokes represented as 3-tuples
factor: scaling factor. The smaller the scaling factor, the bigger the SVG picture and vice versa.
"""
def get_bounds(data, factor):
"""Return bounds of data."""
min_x = 0
max_x = 0
min_y = 0
max_y = 0
abs_x = 0
abs_y = 0
for i in range(len(data)):
x = float(data[i, 0]) / factor
y = float(data[i, 1]) / factor
abs_x += x
abs_y += y
min_x = min(min_x, abs_x)
min_y = min(min_y, abs_y)
max_x = max(max_x, abs_x)
max_y = max(max_y, abs_y)
return (min_x, max_x, min_y, max_y)
data = np.array(drawing)
min_x, max_x, min_y, max_y = get_bounds(data, factor)
dims = (50 + max_x - min_x, 50 + max_y - min_y)
dwg = svgwrite.Drawing(size=dims)
dwg.add(dwg.rect(insert=(0, 0), size=dims,fill='white'))
lift_pen = 1
abs_x = 25 - min_x
abs_y = 25 - min_y
p = "M%s,%s " % (abs_x, abs_y)
command = "m"
for i in range(len(data)):
if (lift_pen == 1):
command = "m"
elif (command != "l"):
command = "l"
else:
command = ""
x = float(data[i,0])/factor
y = float(data[i,1])/factor
lift_pen = data[i, 2]
p += command+str(x)+","+str(y)+" "
the_color = "black"
stroke_width = 1
dwg.add(dwg.path(p).stroke(the_color,stroke_width).fill("none"))
display(SVG(dwg.tostring()))
```
</details>
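For example, a hedged usage sketch of `draw_strokes` (assuming the `sketch_rnn` configuration loaded via the `datasets` library):
```python
from datasets import load_dataset

dataset = load_dataset("quickdraw", "sketch_rnn", split="train")
# Render the first drawing of the training split as an inline SVG
draw_strokes(dataset[0]["drawing"])
```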
> **Note**: Sketch-RNN takes as input strokes represented as 5-tuples, with drawings padded to a common maximum length and prefixed by the special start token `[0, 0, 1, 0, 0]`. The 5-tuple representation consists of x-offset, y-offset, and p1, p2, p3, a binary one-hot vector of 3 possible pen states: pen down, pen up, end of sketch. More precisely, the first two elements are the offset distance in the x and y directions of the pen from the previous point. The last 3 elements represent a binary one-hot vector of 3 possible states. The first pen state, p1, indicates that the pen is currently touching the paper, and that a line will be drawn connecting the next point with the current point. The second pen state, p2, indicates that the pen will be lifted from the paper after the current point, and that no line will be drawn next. The final pen state, p3, indicates that the drawing has ended, and subsequent points, including the current point, will not be rendered.
><details>
> <summary>
> Click here to see the code for converting drawings to Sketch-RNN input format:
> </summary>
>
> ```python
> def to_sketch_rnn_format(drawing, max_len):
> """Converts a drawing to Sketch-RNN input format.
>
> Args:
> drawing: a list of strokes represented as 3-tuples
> max_len: maximum common length of all drawings
>
> Returns:
> NumPy array
> """
> drawing = np.array(drawing)
> result = np.zeros((max_len, 5), dtype=float)
> l = len(drawing)
> assert l <= max_len
> result[0:l, 0:2] = drawing[:, 0:2]
> result[0:l, 3] = drawing[:, 2]
> result[0:l, 2] = 1 - result[0:l, 3]
> result[l:, 4] = 1
> # Prepend special start token
> result = np.vstack([[0, 0, 1, 0, 0], result])
> return result
> ```
>
></details>
### Data Splits
In the configurations `raw`, `preprocessed_simplified_drawings` and `preprocessed_bitmaps` (default configuration), all the data is contained in the training set, which has 50426266 examples.
`sketch_rnn` and `sketch_rnn_full` have the data split into training, validation and test splits. In the `sketch_rnn` configuration, 75K samples (70K training, 2.5K validation, 2.5K test) have been randomly selected from each category. Therefore, the training set contains 24150000 examples, the validation set 862500 examples and the test set 862500 examples. The `sketch_rnn_full` configuration has the full (training) data for each category, which leads to the training set having 43988874 examples, the validation set 862500 and the test set 862500 examples.
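These split sizes can be inspected without downloading the data; a minimal sketch using `datasets.load_dataset_builder`:
```python
from datasets import load_dataset_builder

builder = load_dataset_builder("quickdraw", "sketch_rnn")
# Split metadata is read from the dataset card, so no data files are fetched
for split, info in builder.info.splits.items():
    print(split, info.num_examples)
# train 24150000 / validation 862500 / test 862500
```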
## Dataset Creation
### Curation Rationale
From the GitHub repository:
> The Quick Draw Dataset is a collection of 50 million drawings across [345 categories](categories.txt), contributed by players of the game [Quick, Draw!](https://quickdraw.withgoogle.com). The drawings were captured as timestamped vectors, tagged with metadata including what the player was asked to draw and in which country the player was located. You can browse the recognized drawings on [quickdraw.withgoogle.com/data](https://quickdraw.withgoogle.com/data).
>
> We're sharing them here for developers, researchers, and artists to explore, study, and learn from.
### Source Data
#### Initial Data Collection and Normalization
This dataset contains vector drawings obtained from [Quick, Draw!](https://quickdraw.withgoogle.com/), an online game where the players are asked to draw objects belonging to a particular object class in less than 20 seconds.
#### Who are the source language producers?
The participants in the [Quick, Draw!](https://quickdraw.withgoogle.com/) game.
### Annotations
#### Annotation process
The annotations are machine-generated and match the category the player was prompted to draw.
#### Who are the annotators?
The annotations are machine-generated.
### Personal and Sensitive Information
Some sketches are known to be problematic (see https://github.com/googlecreativelab/quickdraw-dataset/issues/74 and https://github.com/googlecreativelab/quickdraw-dataset/issues/18).
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Jonas Jongejan, Henry Rowley, Takashi Kawashima, Jongmin Kim and Nick Fox-Gieg.
### Licensing Information
The data is made available by Google, Inc. under the [Creative Commons Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/) license.
### Citation Information
```bibtex
@article{DBLP:journals/corr/HaE17,
author = {David Ha and
Douglas Eck},
title = {A Neural Representation of Sketch Drawings},
journal = {CoRR},
volume = {abs/1704.03477},
year = {2017},
url = {http://arxiv.org/abs/1704.03477},
archivePrefix = {arXiv},
eprint = {1704.03477},
timestamp = {Mon, 13 Aug 2018 16:48:30 +0200},
biburl = {https://dblp.org/rec/bib/journals/corr/HaE17},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
odunola/french-preprocessed-2 | ---
dataset_info:
features:
- name: english_transcript
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 10936196016
num_examples: 11386
download_size: 1890718214
dataset_size: 10936196016
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jonathansuru/gibby_dataset | ---
license: apache-2.0
task_categories:
- text-classification
- table-question-answering
- question-answering
- summarization
- conversational
- feature-extraction
language:
- en
--- |
JetBrains-Research/lca-module-to-text | ---
dataset_info:
features:
- name: repo
dtype: string
- name: docfile_name
dtype: string
- name: doc_type
dtype: string
- name: intent
dtype: string
- name: license
dtype: string
- name: path_to_docfile
dtype: string
- name: relevant_code_files
sequence: string
- name: relevant_code_dir
dtype: string
- name: target_text
dtype: string
- name: relevant_code_context
dtype: string
splits:
- name: test
num_bytes: 227163668
num_examples: 216
download_size: 30375843
dataset_size: 227163668
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# LCA (module-to-text)
This is the data for the module-to-text benchmark, part of LCA.
## How-to
1. List all the available configs via [`datasets.get_dataset_config_names`](https://huggingface.co/docs/datasets/v2.14.3/en/package_reference/loading_methods#datasets.get_dataset_config_names) and choose an appropriate one
2. Load the data via [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.3/en/package_reference/loading_methods#datasets.load_dataset):
```
from datasets import load_dataset
dataset = load_dataset("JetBrains-Research/lca-module-to-text")
```
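For step 1, a minimal sketch of listing the available configs before loading (which config is appropriate depends on your use case):
```python
from datasets import get_dataset_config_names, load_dataset

# Step 1: list the available configs and choose one
configs = get_dataset_config_names("JetBrains-Research/lca-module-to-text")
print(configs)

# Step 2: load the chosen config (here, simply the first one)
dataset = load_dataset("JetBrains-Research/lca-module-to-text", configs[0])
```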
## Dataset Structure
Each example has the following fields:
| **Field** | **Description** |
|:---------------------------:|:----------------------------------------:|
| `repo` | Name of the repository |
| `target_text` | Target text (the doc file) |
| `docfile_name` | Name of the file with the target doc |
| `doc_type` | Type of the target doc |
| `intent` | Instruction for generation |
| `license` | License of the target repo |
| `relevant_code_files` | Paths to relevant code files |
| `relevant_code_dir` | Paths to relevant code dirs |
| `path_to_docfile` | Path to the file with the documentation |
| `relevant_code_context` | Relevant code context |
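A minimal sketch of accessing these fields (the `test` split name comes from the metadata above):
```python
from datasets import load_dataset

dataset = load_dataset("JetBrains-Research/lca-module-to-text")
example = dataset["test"][0]

# Inputs for generation: repository name, instruction, and code context
print(example["repo"])
print(example["intent"])
print(example["relevant_code_context"][:300])

# Target documentation text that generations are compared against
print(example["target_text"][:300])
```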
Note: you may collect and use your own relevant context, as our context may not be suitable for every use case. A folder with the zipped repositories can be found under Files and versions. |
open-llm-leaderboard/details_nicholasKluge__Aira-124M | ---
pretty_name: Evaluation run of nicholasKluge/Aira-124M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-124M](https://huggingface.co/nicholasKluge/Aira-124M) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-124M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T19:04:35.532451](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-124M/blob/main/results_2023-08-29T19%3A04%3A35.532451.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25265346552614076,\n\
\ \"acc_stderr\": 0.03117857003137413,\n \"acc_norm\": 0.253799928797708,\n\
\ \"acc_norm_stderr\": 0.03119563902907945,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.41020465472810524,\n\
\ \"mc2_stderr\": 0.015012374839842264\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19880546075085323,\n \"acc_stderr\": 0.01166285019817554,\n\
\ \"acc_norm\": 0.24573378839590443,\n \"acc_norm_stderr\": 0.012581033453730107\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2921728739294961,\n\
\ \"acc_stderr\": 0.004538319464111971,\n \"acc_norm\": 0.312885879306911,\n\
\ \"acc_norm_stderr\": 0.004627207073171273\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678316,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678316\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.13,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.13,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n\
\ \"acc_stderr\": 0.023540799358723285,\n \"acc_norm\": 0.21935483870967742,\n\
\ \"acc_norm_stderr\": 0.023540799358723285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.02798672466673622,\n\
\ \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.02798672466673622\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926763,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926763\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3467889908256881,\n \"acc_stderr\": 0.020406097104093027,\n \"\
acc_norm\": 0.3467889908256881,\n \"acc_norm_stderr\": 0.020406097104093027\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n\
\ \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.27941176470588236,\n\
\ \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395594,\n\
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395594\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1210762331838565,\n\
\ \"acc_stderr\": 0.021894174113185737,\n \"acc_norm\": 0.1210762331838565,\n\
\ \"acc_norm_stderr\": 0.021894174113185737\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.027778835904935427,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.027778835904935427\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24521072796934865,\n\
\ \"acc_stderr\": 0.01538435228454394,\n \"acc_norm\": 0.24521072796934865,\n\
\ \"acc_norm_stderr\": 0.01538435228454394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18006430868167203,\n\
\ \"acc_stderr\": 0.021823422857744953,\n \"acc_norm\": 0.18006430868167203,\n\
\ \"acc_norm_stderr\": 0.021823422857744953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.02419180860071301,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.02419180860071301\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25554106910039115,\n\
\ \"acc_stderr\": 0.011139857833598506,\n \"acc_norm\": 0.25554106910039115,\n\
\ \"acc_norm_stderr\": 0.011139857833598506\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.036942843353378,\n\
\ \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.036942843353378\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.22289156626506024,\n \"acc_stderr\": 0.03240004825594689,\n\
\ \"acc_norm\": 0.22289156626506024,\n \"acc_norm_stderr\": 0.03240004825594689\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.01507721920066259,\n\
\ \"mc2\": 0.41020465472810524,\n \"mc2_stderr\": 0.015012374839842264\n\
\ }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-124M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:04:35.532451.parquet'
- config_name: results
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- results_2023-08-29T19:04:35.532451.parquet
- split: latest
path:
- results_2023-08-29T19:04:35.532451.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-124M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-124M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-124M](https://huggingface.co/nicholasKluge/Aira-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-124M",
"harness_truthfulqa_mc_0",
split="train")
```
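The aggregated metrics are also exposed as a standalone config; a minimal sketch of loading them (the `results` config and `latest` split names come from this card's metadata):
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_nicholasKluge__Aira-124M",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent evaluation run
```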
## Latest results
These are the [latest results from run 2023-08-29T19:04:35.532451](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-124M/blob/main/results_2023-08-29T19%3A04%3A35.532451.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25265346552614076,
"acc_stderr": 0.03117857003137413,
"acc_norm": 0.253799928797708,
"acc_norm_stderr": 0.03119563902907945,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.41020465472810524,
"mc2_stderr": 0.015012374839842264
},
"harness|arc:challenge|25": {
"acc": 0.19880546075085323,
"acc_stderr": 0.01166285019817554,
"acc_norm": 0.24573378839590443,
"acc_norm_stderr": 0.012581033453730107
},
"harness|hellaswag|10": {
"acc": 0.2921728739294961,
"acc_stderr": 0.004538319464111971,
"acc_norm": 0.312885879306911,
"acc_norm_stderr": 0.004627207073171273
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678316,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678316
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.13,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.13,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.02798672466673622,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.02798672466673622
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926763,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926763
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3467889908256881,
"acc_stderr": 0.020406097104093027,
"acc_norm": 0.3467889908256881,
"acc_norm_stderr": 0.020406097104093027
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395594,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395594
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1210762331838565,
"acc_stderr": 0.021894174113185737,
"acc_norm": 0.1210762331838565,
"acc_norm_stderr": 0.021894174113185737
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935427,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935427
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24521072796934865,
"acc_stderr": 0.01538435228454394,
"acc_norm": 0.24521072796934865,
"acc_norm_stderr": 0.01538435228454394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18006430868167203,
"acc_stderr": 0.021823422857744953,
"acc_norm": 0.18006430868167203,
"acc_norm_stderr": 0.021823422857744953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.02419180860071301,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.02419180860071301
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25554106910039115,
"acc_stderr": 0.011139857833598506,
"acc_norm": 0.25554106910039115,
"acc_norm_stderr": 0.011139857833598506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353378,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594689,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594689
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.41020465472810524,
"mc2_stderr": 0.015012374839842264
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bigbio/n2c2_2018_track2 |
---
language:
- en
bigbio_language:
- English
license: other
multilinguality: monolingual
bigbio_license_shortname: DUA
pretty_name: n2c2 2018 ADE
homepage: https://portal.dbmi.hms.harvard.edu/projects/n2c2-nlp/
bigbio_pubmed: False
bigbio_public: False
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- RELATION_EXTRACTION
---
# Dataset Card for n2c2 2018 ADE
## Dataset Description
- **Homepage:** https://portal.dbmi.hms.harvard.edu/projects/n2c2-nlp/
- **Pubmed:** False
- **Public:** False
- **Tasks:** NER,RE
The National NLP Clinical Challenges (n2c2), organized in 2018, continued the
legacy of i2b2 (Informatics for Integrating Biology and the Bedside), adding 2 new tracks and 2
new sets of data to the shared tasks organized since 2006. Track 2 of 2018
n2c2 shared tasks focused on the extraction of medications, with their signature
information, and adverse drug events (ADEs) from clinical narratives.
This track built on our previous medication challenge, but added a special focus on ADEs.
ADEs are injuries resulting from a medical intervention related to a drug and
can include allergic reactions, drug interactions, overdoses, and medication errors.
Collectively, ADEs are estimated to account for 30% of all hospital adverse
events; however, ADEs are preventable. Identifying potential drug interactions,
overdoses, allergies, and errors at the point of care and alerting the caregivers of
potential ADEs can improve health delivery, reduce the risk of ADEs, and improve health
outcomes.
A step in this direction requires processing narratives of clinical records
that often elaborate on the medications given to a patient, as well as the known
allergies, reactions, and adverse events of the patient. Extraction of this information
from narratives complements the structured medication information that can be
obtained from prescriptions, allowing a more thorough assessment of potential ADEs
before they happen.
The 2018 n2c2 shared task Track 2, hereafter referred to as the ADE track,
tackled these natural language processing tasks in 3 different steps,
which we refer to as tasks:
1. Concept Extraction: identification of concepts related to medications,
their signature information, and ADEs
2. Relation Classification: linking the previously mentioned concepts to
their medication by identifying relations on gold standard concepts
3. End-to-End: building end-to-end systems that process raw narrative text
to discover concepts and find relations of those concepts to their medications
Shared tasks provide a venue for head-to-head comparison of systems developed
for the same task and on the same data, allowing researchers to identify the state
of the art in a particular task, learn from it, and build on it.
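To make the three tasks above concrete, here is a minimal, hypothetical loading sketch using the Hugging Face `datasets` library. The data itself must be requested under the DUA and supplied locally; the `data_dir` path is illustrative, and the `n2c2_2018_track2_bigbio_kb` config name is assumed from the usual BigBio naming convention.
```python
from datasets import load_dataset

# The n2c2 files are not redistributable: request them under the DUA and point
# the loader at a local copy via `data_dir` (the path below is illustrative).
dataset = load_dataset(
    "bigbio/n2c2_2018_track2",
    name="n2c2_2018_track2_bigbio_kb",  # BigBio KB schema; config name assumed
    data_dir="/path/to/n2c2-2018-track2",
    trust_remote_code=True,  # may be required for script-based BigBio loaders
)

doc = dataset["train"][0]
print(doc["entities"][:3])   # task 1: medication/signature/ADE concepts
print(doc["relations"][:3])  # task 2: concept-to-medication relations
```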
## Citation Information
```
@article{DBLP:journals/jamia/HenryBFSU20,
author = {
Sam Henry and
Kevin Buchan and
Michele Filannino and
Amber Stubbs and
Ozlem Uzuner
},
title = {2018 n2c2 shared task on adverse drug events and medication extraction
in electronic health records},
journal = {J. Am. Medical Informatics Assoc.},
volume = {27},
number = {1},
pages = {3--12},
year = {2020},
url = {https://doi.org/10.1093/jamia/ocz166},
doi = {10.1093/jamia/ocz166},
timestamp = {Sat, 30 May 2020 19:53:56 +0200},
biburl = {https://dblp.org/rec/journals/jamia/HenryBFSU20.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
botbot-ai/aya_dataset_pt | ---
language:
- pt
pretty_name: Aya Dataset Portuguese
tags:
- aya
- portuguese
- legal
- chemistry
license: apache-2.0
size_categories:
- 1K<n<10K
---
CohereForAI [Aya Dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset) filtered for Portuguese (PT).
**Aya Dataset Summary**
The [Aya Dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset) is a multilingual instruction fine-tuning dataset curated by an open-science community via Aya Annotation Platform from Cohere For AI. The dataset contains a total of 204k human-annotated prompt-completion pairs along with the demographics data of the annotators.
This dataset can be used to train, finetune, and evaluate multilingual LLMs.
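As a quick illustration of that use, the sketch below loads this Portuguese subset for inspection or fine-tuning. The `train` split and the `inputs`/`targets` column names are assumptions carried over from the upstream Aya schema.
```python
from datasets import load_dataset

# Load the Portuguese-filtered Aya subset (split name assumed to be "train").
aya_pt = load_dataset("botbot-ai/aya_dataset_pt", split="train")

# Each row pairs a human-written prompt with its completion; "inputs" and
# "targets" follow the upstream Aya column names (an assumption here).
for row in aya_pt.select(range(3)):
    print(row["inputs"], "->", row["targets"])
```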
Curated by: Contributors of Aya Open Science Initiative.
Language(s): 65 languages (71 including dialects & scripts).
License: Apache 2.0 |
open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B | ---
pretty_name: Evaluation run of Weyaxi/Instruct-v0.2-Seraph-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Instruct-v0.2-Seraph-7B](https://huggingface.co/Weyaxi/Instruct-v0.2-Seraph-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T13:51:56.485977](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B/blob/main/results_2023-12-13T13-51-56.485977.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.630546295773675,\n\
\ \"acc_stderr\": 0.03269222189548608,\n \"acc_norm\": 0.6329595358982607,\n\
\ \"acc_norm_stderr\": 0.03335128778096069,\n \"mc1\": 0.4810281517747858,\n\
\ \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.6539318892450207,\n\
\ \"mc2_stderr\": 0.015152914709562705\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414046,\n\
\ \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.013960142600598672\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6605257916749652,\n\
\ \"acc_stderr\": 0.004725630911520329,\n \"acc_norm\": 0.8419637522405895,\n\
\ \"acc_norm_stderr\": 0.003640294912838693\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381401,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381401\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4849162011173184,\n\
\ \"acc_stderr\": 0.016714890379996062,\n \"acc_norm\": 0.4849162011173184,\n\
\ \"acc_norm_stderr\": 0.016714890379996062\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.019412539242032165,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.019412539242032165\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.02796267760476891,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.02796267760476891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4810281517747858,\n\
\ \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.6539318892450207,\n\
\ \"mc2_stderr\": 0.015152914709562705\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987736\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5443517816527672,\n \
\ \"acc_stderr\": 0.013718194542485601\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Instruct-v0.2-Seraph-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|arc:challenge|25_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|gsm8k|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hellaswag|10_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-51-56.485977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T13-51-56.485977.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- '**/details_harness|winogrande|5_2023-12-13T13-51-56.485977.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T13-51-56.485977.parquet'
- config_name: results
data_files:
- split: 2023_12_13T13_51_56.485977
path:
- results_2023-12-13T13-51-56.485977.parquet
- split: latest
path:
- results_2023-12-13T13-51-56.485977.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Instruct-v0.2-Seraph-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Instruct-v0.2-Seraph-7B](https://huggingface.co/Weyaxi/Instruct-v0.2-Seraph-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B",
"harness_winogrande_5",
split="train")
```
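Similarly, the aggregated metrics can be pulled from the "results" configuration; per the config list in this card, the `latest` split resolves to the most recent results file:
```python
from datasets import load_dataset

# "results" aggregates all task metrics for the run; "latest" points at the
# most recent results parquet (both names appear in the configs above).
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B",
    "results",
    split="latest",
)
print(results[0])
```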
## Latest results
These are the [latest results from run 2023-12-13T13:51:56.485977](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B/blob/main/results_2023-12-13T13-51-56.485977.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.630546295773675,
"acc_stderr": 0.03269222189548608,
"acc_norm": 0.6329595358982607,
"acc_norm_stderr": 0.03335128778096069,
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.6539318892450207,
"mc2_stderr": 0.015152914709562705
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414046,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.013960142600598672
},
"harness|hellaswag|10": {
"acc": 0.6605257916749652,
"acc_stderr": 0.004725630911520329,
"acc_norm": 0.8419637522405895,
"acc_norm_stderr": 0.003640294912838693
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381401,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381401
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4849162011173184,
"acc_stderr": 0.016714890379996062,
"acc_norm": 0.4849162011173184,
"acc_norm_stderr": 0.016714890379996062
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464482,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.019412539242032165,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.019412539242032165
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.02796267760476891,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.02796267760476891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.6539318892450207,
"mc2_stderr": 0.015152914709562705
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987736
},
"harness|gsm8k|5": {
"acc": 0.5443517816527672,
"acc_stderr": 0.013718194542485601
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Astral-P/LucyHeartfilia | ---
license: wtfpl
---
|
adamjweintraut/lyrlen_data | ---
dataset_info:
features:
- name: title
dtype: string
- name: lyrics
dtype: string
- name: id
dtype: int64
- name: genre
dtype: string
- name: sylls
dtype: string
- name: orig
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 162247767
num_examples: 40000
- name: test
num_bytes: 19475238
num_examples: 5000
- name: valid
num_bytes: 21003078
num_examples: 5000
download_size: 97307063
dataset_size: 202726083
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
matallanas/dreambooth-dotnet | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 5743920.0
num_examples: 28
download_size: 5744431
dataset_size: 5743920.0
---
# Dataset Card for "dreambooth-dotnet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nkasmanoff/huggingface-datasets | ---
dataset_info:
features:
- name: id
dtype: string
- name: private
dtype: bool
- name: tags
sequence: string
- name: description
dtype: string
- name: downloads
dtype: int64
- name: likes
dtype: int64
splits:
- name: train
num_bytes: 4086603
num_examples: 30135
download_size: 1437762
dataset_size: 4086603
---
# Dataset Card for "huggingface-datasets"
This dataset is a snapshot of all public datasets on Hugging Face as of 04/24/2023. It is based on the dataset metadata available at the following endpoint:
https://huggingface.co/api/datasets/{dataset_id}
This endpoint returns information such as the dataset name, its tags, description, and more. Please note that the description is different from the dataset card, which is what you are reading now :-).
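For illustration, here is a minimal sketch of querying that endpoint with the `requests` library (the dataset id below is only an example, and the exact fields returned can vary by dataset):
```python
import requests

# Fetch metadata for one public dataset from the Hub API,
# the same endpoint this snapshot was built from.
dataset_id = "nkasmanoff/huggingface-datasets"
resp = requests.get(f"https://huggingface.co/api/datasets/{dataset_id}")
resp.raise_for_status()
meta = resp.json()

# These fields mirror this dataset's columns; .get() guards
# against fields that are absent for some datasets.
print(meta["id"], meta.get("private"), meta.get("tags"),
      meta.get("downloads"), meta.get("likes"), meta.get("description"))
```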
I would love to replace this dataset with one that uses the dataset card instead of the description, but that is not something I can scrape in a realistic amount of time. In any case, this data still contains some useful metadata about the datasets on HF and can be used for a variety of downstream tasks. Please like if you enjoy <3.
For more insight into how this data was collected and how it can be used, please check out the repository here: https://github.com/nkasmanoff/searching-face
I did not check all descriptions in this >30k-sample dataset. Most are null, but it is possible that some are NSFW. Please use responsibly.
|
PepBun/exampledataset3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 28604165.0
num_examples: 24
download_size: 28492538
dataset_size: 28604165.0
---
# Dataset Card for "exampledataset3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dooooosooooo/ai-chatbot-scrubee | ---
license: unknown
task_categories:
- question-answering
language:
- ko
--- |
easytpp/taobao | ---
license: apache-2.0
---
|
sankovic/shirimteste | ---
license: openrail
---
|
Leo12344321/synthetic_realfakeimage | ---
license: mit
---
|