datasetId | card |
|---|---|
Nkumar5/FMARock | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 79535716.375
num_examples: 1805
download_size: 79512581
dataset_size: 79535716.375
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
one-sec-cv12/chunk_197 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 16920007776.75
num_examples: 176162
download_size: 15321877063
dataset_size: 16920007776.75
---
# Dataset Card for "chunk_197"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_varox34__minillm-7B-init-13B-sft | ---
pretty_name: Evaluation run of varox34/minillm-7B-init-13B-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [varox34/minillm-7B-init-13B-sft](https://huggingface.co/varox34/minillm-7B-init-13B-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_varox34__minillm-7B-init-13B-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T15:56:46.315091](https://huggingface.co/datasets/open-llm-leaderboard/details_varox34__minillm-7B-init-13B-sft/blob/main/results_2024-04-15T15-56-46.315091.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.35462956475919355,\n\
\ \"acc_stderr\": 0.03344663620513609,\n \"acc_norm\": 0.3571839751503999,\n\
\ \"acc_norm_stderr\": 0.034231531268842284,\n \"mc1\": 0.2141982864137087,\n\
\ \"mc1_stderr\": 0.01436214815569047,\n \"mc2\": 0.3356507077011393,\n\
\ \"mc2_stderr\": 0.014023095423451806\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255793,\n\
\ \"acc_norm\": 0.523037542662116,\n \"acc_norm_stderr\": 0.014595873205358262\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5846444931288588,\n\
\ \"acc_stderr\": 0.004917761181740164,\n \"acc_norm\": 0.7779326827325234,\n\
\ \"acc_norm_stderr\": 0.004147867246653342\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.41132075471698115,\n \"acc_stderr\": 0.0302850092590098,\n\
\ \"acc_norm\": 0.41132075471698115,\n \"acc_norm_stderr\": 0.0302850092590098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.03124532520276193,\n\
\ \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.03124532520276193\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.02271746789770862,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.02271746789770862\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.34516129032258064,\n \"acc_stderr\": 0.027045746573534327,\n \"\
acc_norm\": 0.34516129032258064,\n \"acc_norm_stderr\": 0.027045746573534327\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.03895658065271847,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.03895658065271847\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3686868686868687,\n \"acc_stderr\": 0.03437305501980619,\n \"\
acc_norm\": 0.3686868686868687,\n \"acc_norm_stderr\": 0.03437305501980619\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.48186528497409326,\n \"acc_stderr\": 0.03606065001832919,\n\
\ \"acc_norm\": 0.48186528497409326,\n \"acc_norm_stderr\": 0.03606065001832919\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3230769230769231,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.3230769230769231,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766114,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.27310924369747897,\n \"acc_stderr\": 0.028942004040998164,\n\
\ \"acc_norm\": 0.27310924369747897,\n \"acc_norm_stderr\": 0.028942004040998164\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.42935779816513764,\n\
\ \"acc_stderr\": 0.021222286397236504,\n \"acc_norm\": 0.42935779816513764,\n\
\ \"acc_norm_stderr\": 0.021222286397236504\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.03256850570293647,\n\
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.03256850570293647\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.03454236585380608,\n \"\
acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.03454236585380608\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4388185654008439,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.4388185654008439,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.48878923766816146,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.48878923766816146,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5371900826446281,\n \"acc_stderr\": 0.04551711196104218,\n \"\
acc_norm\": 0.5371900826446281,\n \"acc_norm_stderr\": 0.04551711196104218\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.047128212574267705,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.047128212574267705\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3374233128834356,\n \"acc_stderr\": 0.037149084099355745,\n\
\ \"acc_norm\": 0.3374233128834356,\n \"acc_norm_stderr\": 0.037149084099355745\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3300970873786408,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.3300970873786408,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.49145299145299143,\n\
\ \"acc_stderr\": 0.032751303000970296,\n \"acc_norm\": 0.49145299145299143,\n\
\ \"acc_norm_stderr\": 0.032751303000970296\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.44316730523627074,\n\
\ \"acc_stderr\": 0.01776408503534841,\n \"acc_norm\": 0.44316730523627074,\n\
\ \"acc_norm_stderr\": 0.01776408503534841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.38439306358381503,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.38439306358381503,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961452,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.02718449890994162,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.02718449890994162\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3440514469453376,\n\
\ \"acc_stderr\": 0.026981478043648033,\n \"acc_norm\": 0.3440514469453376,\n\
\ \"acc_norm_stderr\": 0.026981478043648033\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.36419753086419754,\n \"acc_stderr\": 0.026774929899722324,\n\
\ \"acc_norm\": 0.36419753086419754,\n \"acc_norm_stderr\": 0.026774929899722324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.33687943262411346,\n \"acc_stderr\": 0.028195534873966727,\n \
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.028195534873966727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30834419817470665,\n\
\ \"acc_stderr\": 0.011794833789715329,\n \"acc_norm\": 0.30834419817470665,\n\
\ \"acc_norm_stderr\": 0.011794833789715329\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.028888193103988633,\n\
\ \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.028888193103988633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.36764705882352944,\n \"acc_stderr\": 0.019506291693954847,\n \
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.019506291693954847\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.36363636363636365,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.36363636363636365,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.31020408163265306,\n \"acc_stderr\": 0.029613459872484375,\n\
\ \"acc_norm\": 0.31020408163265306,\n \"acc_norm_stderr\": 0.029613459872484375\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.39303482587064675,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.39303482587064675,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.03805797505590459,\n\
\ \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.03805797505590459\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2141982864137087,\n\
\ \"mc1_stderr\": 0.01436214815569047,\n \"mc2\": 0.3356507077011393,\n\
\ \"mc2_stderr\": 0.014023095423451806\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638256\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07050796057619409,\n \
\ \"acc_stderr\": 0.007051543813983613\n }\n}\n```"
repo_url: https://huggingface.co/varox34/minillm-7B-init-13B-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|arc:challenge|25_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|gsm8k|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hellaswag|10_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T15-56-46.315091.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T15-56-46.315091.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- '**/details_harness|winogrande|5_2024-04-15T15-56-46.315091.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T15-56-46.315091.parquet'
- config_name: results
data_files:
- split: 2024_04_15T15_56_46.315091
path:
- results_2024-04-15T15-56-46.315091.parquet
- split: latest
path:
- results_2024-04-15T15-56-46.315091.parquet
---
# Dataset Card for Evaluation run of varox34/minillm-7B-init-13B-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [varox34/minillm-7B-init-13B-sft](https://huggingface.co/varox34/minillm-7B-init-13B-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_varox34__minillm-7B-init-13B-sft",
"harness_winogrande_5",
split="train")
```
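The same pattern works for any of the configurations listed in the YAML header above. As a minimal sketch (config and split names taken from that header), this loads the aggregated "results" configuration at its "latest" split:
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split always points
# at the most recent evaluation of this model.
results = load_dataset(
    "open-llm-leaderboard/details_varox34__minillm-7B-init-13B-sft",
    "results",
    split="latest",
)
print(results[0])  # one row per run, holding the aggregated scores
```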
## Latest results
These are the [latest results from run 2024-04-15T15:56:46.315091](https://huggingface.co/datasets/open-llm-leaderboard/details_varox34__minillm-7B-init-13B-sft/blob/main/results_2024-04-15T15-56-46.315091.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.35462956475919355,
"acc_stderr": 0.03344663620513609,
"acc_norm": 0.3571839751503999,
"acc_norm_stderr": 0.034231531268842284,
"mc1": 0.2141982864137087,
"mc1_stderr": 0.01436214815569047,
"mc2": 0.3356507077011393,
"mc2_stderr": 0.014023095423451806
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255793,
"acc_norm": 0.523037542662116,
"acc_norm_stderr": 0.014595873205358262
},
"harness|hellaswag|10": {
"acc": 0.5846444931288588,
"acc_stderr": 0.004917761181740164,
"acc_norm": 0.7779326827325234,
"acc_norm_stderr": 0.004147867246653342
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41132075471698115,
"acc_stderr": 0.0302850092590098,
"acc_norm": 0.41132075471698115,
"acc_norm_stderr": 0.0302850092590098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.03124532520276193,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.03124532520276193
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770862,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.02271746789770862
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.34516129032258064,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.34516129032258064,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.03895658065271847,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.03895658065271847
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3686868686868687,
"acc_stderr": 0.03437305501980619,
"acc_norm": 0.3686868686868687,
"acc_norm_stderr": 0.03437305501980619
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.48186528497409326,
"acc_stderr": 0.03606065001832919,
"acc_norm": 0.48186528497409326,
"acc_norm_stderr": 0.03606065001832919
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3230769230769231,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.3230769230769231,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766114,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.27310924369747897,
"acc_stderr": 0.028942004040998164,
"acc_norm": 0.27310924369747897,
"acc_norm_stderr": 0.028942004040998164
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42935779816513764,
"acc_stderr": 0.021222286397236504,
"acc_norm": 0.42935779816513764,
"acc_norm_stderr": 0.021222286397236504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.03454236585380608,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.03454236585380608
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4388185654008439,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.4388185654008439,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.48878923766816146,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.48878923766816146,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5371900826446281,
"acc_stderr": 0.04551711196104218,
"acc_norm": 0.5371900826446281,
"acc_norm_stderr": 0.04551711196104218
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.047128212574267705,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.047128212574267705
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3374233128834356,
"acc_stderr": 0.037149084099355745,
"acc_norm": 0.3374233128834356,
"acc_norm_stderr": 0.037149084099355745
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.3300970873786408,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.3300970873786408,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.49145299145299143,
"acc_stderr": 0.032751303000970296,
"acc_norm": 0.49145299145299143,
"acc_norm_stderr": 0.032751303000970296
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.44316730523627074,
"acc_stderr": 0.01776408503534841,
"acc_norm": 0.44316730523627074,
"acc_norm_stderr": 0.01776408503534841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.38439306358381503,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.38439306358381503,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961452,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.02718449890994162,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.02718449890994162
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3440514469453376,
"acc_stderr": 0.026981478043648033,
"acc_norm": 0.3440514469453376,
"acc_norm_stderr": 0.026981478043648033
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36419753086419754,
"acc_stderr": 0.026774929899722324,
"acc_norm": 0.36419753086419754,
"acc_norm_stderr": 0.026774929899722324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.028195534873966727,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.028195534873966727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30834419817470665,
"acc_stderr": 0.011794833789715329,
"acc_norm": 0.30834419817470665,
"acc_norm_stderr": 0.011794833789715329
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.019506291693954847,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.019506291693954847
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.31020408163265306,
"acc_stderr": 0.029613459872484375,
"acc_norm": 0.31020408163265306,
"acc_norm_stderr": 0.029613459872484375
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.39303482587064675,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.39303482587064675,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.03805797505590459,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.03805797505590459
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2141982864137087,
"mc1_stderr": 0.01436214815569047,
"mc2": 0.3356507077011393,
"mc2_stderr": 0.014023095423451806
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638256
},
"harness|gsm8k|5": {
"acc": 0.07050796057619409,
"acc_stderr": 0.007051543813983613
}
}
```
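As a quick illustration, the per-subtask scores above can be scanned programmatically. A minimal sketch (the `sample_results` dict is a hypothetical excerpt copying a few entries from the JSON above, not a harness API):
```python
# Minimal sketch: find the strongest and weakest MMLU subtasks by accuracy.
# `sample_results` is a hypothetical excerpt of the results dict shown above.
sample_results = {
    "harness|hendrycksTest-international_law|5": {"acc": 0.5371900826446281},
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.22321428571428573},
    "harness|hendrycksTest-marketing|5": {"acc": 0.49145299145299143},
}

mmlu = {task: scores["acc"] for task, scores in sample_results.items()
        if task.startswith("harness|hendrycksTest-")}
best = max(mmlu, key=mmlu.get)
worst = min(mmlu, key=mmlu.get)
print(f"best:  {best} ({mmlu[best]:.3f})")
print(f"worst: {worst} ({mmlu[worst]:.3f})")
```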
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_228 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1176486232.0
num_examples: 231046
download_size: 1198634191
dataset_size: 1176486232.0
---
# Dataset Card for "chunk_228"
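A minimal inspection sketch (the repo id is assumed from this card's title, and the field shapes follow the schema above):
```python
from datasets import load_dataset

# Assumed repo id from this card; each row pairs a distilled logit vector
# with a sequence of MFCC frames.
ds = load_dataset("mask-distilled-onesec-cv12-each-chunk-uniq/chunk_228", split="train")
row = ds[0]
print(len(row["logits"]))   # length of the float32 logit sequence
print(len(row["mfcc"]))     # number of MFCC frames
print(len(row["mfcc"][0]))  # coefficients per frame
```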
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DBQ/Burberry.Product.prices.France | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: France - Burberry - Product-level price list
tags:
- webscraping
- ecommerce
- Burberry
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 1084708
num_examples: 3298
download_size: 320925
dataset_size: 1084708
---
# Burberry web scraped data
## About the website
The **Burberry** brand operates within the **luxury fashion industry** in the EMEA region, particularly in **France**. This sector centers on the creation and retail of upscale clothing and accessories renowned among high-end consumers, and is characterized by exclusive distribution methods, high-quality materials, and strong branding. France, especially Paris, is recognized as a global center for luxury fashion, underpinned by famous fashion houses, influential designers, and high-profile fashion events. The dataset comprises **Ecommerce product-list page (PLP) data** specifically related to Burberry in the French market, reflecting the brand's digital presence and sales in the region.
## Link to **dataset**
[France - Burberry - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Burberry%20Product-prices%20France/r/rectNx7vaD6XOvbMI)
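As a short usage sketch (the repo id is taken from this card and assumed to load directly; pandas is used only for convenience), the price fields above can be combined to check the discount flag against the listed prices:
```python
from datasets import load_dataset

# Assumed repo id from this card; fields follow the schema above.
ds = load_dataset("DBQ/Burberry.Product.prices.France", split="train")
df = ds.to_pandas()

# Discount rate per product, guarding against zero full prices.
df["discount_rate"] = 1.0 - df["price"] / df["full_price"].where(df["full_price"] > 0)

# flg_discount should broadly agree with price < full_price.
print(df[["full_price", "price", "discount_rate", "flg_discount"]].head())
```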
|
sam2ai/telgu_alpaca_dolly_67k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: id
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 109797855
num_examples: 67017
download_size: 43050214
dataset_size: 109797855
---
# Dataset Card for "telgu_alpaca_dolly_67k"
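A minimal formatting sketch (assuming the usual Alpaca-style prompt template applies to the `instruction`/`input`/`output` fields in the schema above):
```python
def to_prompt(row: dict) -> str:
    # Assumed Alpaca-style template; field names match the schema above.
    if row["input"]:
        return (f"### Instruction:\n{row['instruction']}\n\n"
                f"### Input:\n{row['input']}\n\n"
                f"### Response:\n{row['output']}")
    return (f"### Instruction:\n{row['instruction']}\n\n"
            f"### Response:\n{row['output']}")

print(to_prompt({"id": "0", "instruction": "...", "input": "", "output": "..."}))
```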
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zaustt59/bulkhead | ---
license: openrail
---
|
Leon-Chang/pysmore_ml_1m_test | ---
dataset_info:
features:
- name: u_id
dtype: int64
- name: i_id
dtype: int64
- name: rating
dtype: int64
- name: timestamp
dtype: int64
splits:
- name: train
num_bytes: 32006688
num_examples: 1000209
download_size: 9428901
dataset_size: 32006688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
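A quick inspection sketch (the repo id is assumed from this card; the row count matches the classic MovieLens-1M rating file, so `u_id`/`i_id`/`rating`/`timestamp` are read as plain integer columns):
```python
from datasets import load_dataset

# Assumed repo id from this card; all four fields are int64 per the schema.
ds = load_dataset("Leon-Chang/pysmore_ml_1m_test", split="train")
n_users = len(set(ds["u_id"]))
n_items = len(set(ds["i_id"]))
density = len(ds) / (n_users * n_items)
print(f"{n_users} users x {n_items} items, {len(ds)} ratings (density {density:.4f})")
```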
|
open-llm-leaderboard/details_DreadPoor__RoyalMaid-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/RoyalMaid-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/RoyalMaid-7B-slerp](https://huggingface.co/DreadPoor/RoyalMaid-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__RoyalMaid-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T01:41:24.103560](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__RoyalMaid-7B-slerp/blob/main/results_2024-03-01T01-41-24.103560.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6515569934780451,\n\
\ \"acc_stderr\": 0.03217978858726048,\n \"acc_norm\": 0.6519936033885554,\n\
\ \"acc_norm_stderr\": 0.03284082148936091,\n \"mc1\": 0.4834761321909425,\n\
\ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.6417830943266659,\n\
\ \"mc2_stderr\": 0.015108372362023727\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.01382204792228351,\n\
\ \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.013340916085246254\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6921927902808206,\n\
\ \"acc_stderr\": 0.0046064296846045315,\n \"acc_norm\": 0.8725353515236008,\n\
\ \"acc_norm_stderr\": 0.0033281118131353823\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n\
\ \"acc_stderr\": 0.01632076376380838,\n \"acc_norm\": 0.39106145251396646,\n\
\ \"acc_norm_stderr\": 0.01632076376380838\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n\
\ \"acc_stderr\": 0.012728446067669971,\n \"acc_norm\": 0.4595827900912647,\n\
\ \"acc_norm_stderr\": 0.012728446067669971\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174934,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174934\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4834761321909425,\n\
\ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.6417830943266659,\n\
\ \"mc2_stderr\": 0.015108372362023727\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6755117513267627,\n \
\ \"acc_stderr\": 0.012896095359768111\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/RoyalMaid-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-41-24.103560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-41-24.103560.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- '**/details_harness|winogrande|5_2024-03-01T01-41-24.103560.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T01-41-24.103560.parquet'
- config_name: results
data_files:
- split: 2024_03_01T01_41_24.103560
path:
- results_2024-03-01T01-41-24.103560.parquet
- split: latest
path:
- results_2024-03-01T01-41-24.103560.parquet
---
# Dataset Card for Evaluation run of DreadPoor/RoyalMaid-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/RoyalMaid-7B-slerp](https://huggingface.co/DreadPoor/RoyalMaid-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__RoyalMaid-7B-slerp",
"harness_winogrande_5",
split="train")
```
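The per-task configs in this card's metadata also expose a timestamped split and a `latest` alias, so a single task's details can be loaded directly, for instance:
```python
from datasets import load_dataset

# Load one task's details via the "latest" split defined in the configs above.
data = load_dataset("open-llm-leaderboard/details_DreadPoor__RoyalMaid-7B-slerp",
                    "harness_gsm8k_5",
                    split="latest")
```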
## Latest results
These are the [latest results from run 2024-03-01T01:41:24.103560](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__RoyalMaid-7B-slerp/blob/main/results_2024-03-01T01-41-24.103560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6515569934780451,
"acc_stderr": 0.03217978858726048,
"acc_norm": 0.6519936033885554,
"acc_norm_stderr": 0.03284082148936091,
"mc1": 0.4834761321909425,
"mc1_stderr": 0.017493940190057723,
"mc2": 0.6417830943266659,
"mc2_stderr": 0.015108372362023727
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.01382204792228351,
"acc_norm": 0.7039249146757679,
"acc_norm_stderr": 0.013340916085246254
},
"harness|hellaswag|10": {
"acc": 0.6921927902808206,
"acc_stderr": 0.0046064296846045315,
"acc_norm": 0.8725353515236008,
"acc_norm_stderr": 0.0033281118131353823
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39106145251396646,
"acc_stderr": 0.01632076376380838,
"acc_norm": 0.39106145251396646,
"acc_norm_stderr": 0.01632076376380838
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669971,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669971
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4834761321909425,
"mc1_stderr": 0.017493940190057723,
"mc2": 0.6417830943266659,
"mc2_stderr": 0.015108372362023727
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.6755117513267627,
"acc_stderr": 0.012896095359768111
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
relbert/t_rex_relational_similarity | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- n<1K
pretty_name: T-REX for relational similarity
---
# Dataset Card for "relbert/t_rex_relation_similarity"
## Dataset Description
- **Repository:** [RelBERT](https://github.com/asahi417/relbert)
- **Paper:** [https://aclanthology.org/L18-1544/](https://aclanthology.org/L18-1544/)
- **Dataset:** T-REX for relational similarity
## Dataset Summary
This is the clean version of [T-REX](https://aclanthology.org/L18-1544/) converted into a relational similarity dataset format.
The original dataset is [`relbert/t_rex`](https://huggingface.co/datasets/relbert/t_rex).
## Dataset Structure
### Data Instances
An example of `train` looks as follows.
```shell
{
"relation_type": "[Airline] has a hub in [Location]",
"positives": [["Korean Air", "Seoul"], ["Asiana Airlines", "Seoul"], ["Cathay Pacific", "Hong Kong"], ["Dragonair", "Hong Kong"], ["Qantas", "Singapore"], ["Air China", "Beijing"], ["Singapore Airlines", "Singapore"]],
"negatives": [["joint resolution", "United States Congress"], ["joint resolution", "Congress"], ["Great Seal", "United States"], ["trident", "Ukraine"], ["harp", "Ireland"], ["Plantagenet", "England"], ["Pahonia", "Lithuania"], ["slavery", "American Civil War"], ["main asteroid belt", "Solar System"], ["Colorado Desert", "Sonoran Desert"], ["DNA", "genome"], ["Mars", "Solar System"], ["Manchester United", "red"], ["Kermit", "greenness"], ["Ruby", "red"], ["Liberal Party", "red"], ["Macintosh", "Apple"], ["Apple II", "Apple"], ["Apple III", "Apple"], ["PlayStation 2", "Sony"], ["PlayStation 2", "Sony Computer Entertainment"], ["Beatles", "George Martin"], ["Baku", "Azerbaijan"], ["Accra", "Ghana"], ["Amman", "Jordan"], ["Hannover", "Lower Saxony"], ["Agartala", "Tripura"], ["Makassar", "South Sulawesi"], ["Taiwan", "China"], ["Poland", "United Nations"], ["Poland", "Europe"], ["Poland", "European Union"], ["Poland", "NATO"], ["German invasion", "22 June 1941"], ["Operation Barbarossa", "22 June 1941"], ["Brazil", "Catholic Church"], ["Turkey", "Islam"], ["Afghanistan", "Islam"], ["Iraq", "Islam"], ["Finland", "Evangelical Lutheran Church"], ["England", "Roman Catholic"], ["Congress", "United States"], ["Sejm", "Poland"], ["Diet", "Japan"], ["Majlis", "Iran"], ["Riksdag", "Sweden"], ["Croatian Parliament", "Croatia"], ["Knesset", "Israel"], ["Parliament", "Sri Lanka"], ["Russia", "Soviet Union"], ["Ukrainian SSR", "Soviet Union"], ["Royal Flying Corps", "Royal Air Force"], ["Canadian Army", "Canadian Forces"], ["Belarus", "Russian"], ["Russia", "Russian"], ["Ukraine", "Russian"], ["Kerala", "Malayalam"], ["American", "English"], ["zlib license", "Open Source Initiative"], ["EPL", "Open Source Initiative"], ["GNU General Public License", "Open Source Initiative"], ["Wrigley Field", "Cubs"], ["Wrigley Field", "Chicago Cubs"], ["Yankee Stadium", "Yankees"], ["Passaic River", "Newark Bay"], ["Rocky", "Sylvester Stallone"], ["The Godfather", "Francis Ford Coppola"], ["Citizen Kane", "Orson Welles"], ["She Hate Me", "Spike Lee"], ["Raajneeti", "Prakash Jha"], ["Doctor Who", "Patrick Troughton"], ["Doctor Who", "Tom Baker"], ["Jana Gana Mana", "India"], ["President", "White House"], ["Washington", "Federalist Party"], ["George Washington", "Federalist Party"], ["Joseph Stalin", "Communist Party"], ["Mao Zedong", "Communist Party"], ["Lenin", "Communist Party"], ["Nelson Mandela", "ANC"], ["Putin", "Communist Party"], ["Nehru", "Indian National Congress"], ["Nicolas Sarkozy", "UMP"], ["Andreas Papandreou", "PASOK"], ["Tim Cook", "Apple"], ["Israel", "Isaac"], ["Meg", "Peter"], ["Elizabeth II", "Canada"], ["Victor Emmanuel III", "Italy"], ["Umberto I", "Italy"], ["Victor Emmanuel II", "Italy"], ["Brahms", "pianist"], ["Beethoven", "piano"], ["Nicky Hopkins", "pianist"], ["Mozart", "violin"], ["John Zorn", "saxophonist"], ["McCartney", "piano"], ["Russians", "Russian"], ["The Real McCoys", "CBS"], ["Brookside", "Channel 4"], ["The Real McCoys", "ABC"], ["Windows", "Microsoft"], ["Busan", "Gyeongbu Line"], ["Seoul", "Gyeongbu Line"], ["Springer Mountain", "Appalachian Trail"], ["Doctor Who", "BBC One"], ["central time zone", "Illinois"], ["CT", "Canada"], ["Central Time Zone", "Mexico"], ["Central Time Zone", "United States"], ["CT", "American"], ["CT", "Mexico"], ["CT", "United States"], ["central time zone", "Indiana"], ["Central Time Zone", "American"]]
}
```
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   721 |        602 |   24 |
## Citation Information
```
@inproceedings{elsahar2018t,
title={T-rex: A large scale alignment of natural language with knowledge base triples},
author={Elsahar, Hady and Vougiouklis, Pavlos and Remaci, Arslen and Gravier, Christophe and Hare, Jonathon and Laforest, Frederique and Simperl, Elena},
booktitle={Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)},
year={2018}
}
```
|
Kaludi/data-csgo-weapon-classification | ---
task_categories:
- image-classification
---
# Dataset for project: csgo-weapon-classification
## Dataset Description
This dataset for the csgo-weapon-classification project was collected with the help of a bulk Google image downloader.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<1768x718 RGB PIL image>",
"target": 0
},
{
"image": "<716x375 RGBA PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['AK-47', 'AWP', 'Famas', 'Galil-AR', 'Glock', 'M4A1', 'M4A4', 'P-90', 'SG-553', 'UMP', 'USP'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1100 |
| valid | 275 |
|
senhorsapo/loide | ---
license: openrail
---
|
Solshine/Natural_Farming_Recipes_Datachunks | ---
license: other
tags:
- biology
- climate
pretty_name: Natural Farming Fertilizer Recipes
---
licensed under Hippocratic License HL3-CL-ECO-EXTR
[](https://firstdonoharm.dev/version/3/0/cl-eco-extr.html)
The data is in English, not Korean. It details recipes from the Korean Natural Farming tradition, a regenerative agriculture practice and Indigenous Knowledge System.
This dataset was constructed by Caleb DeLeeuw (Solshine) using a langchain RAG embeddings system over the well-known Natural Farming text "Dr. Cho's Global Natural Farming" to extract the key fertilizer recipes as data chunks.
The dataset is in JSON format.
liuyanchen1015/MULTI_VALUE_sst2_got_gotten | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: train
num_bytes: 1821
num_examples: 13
download_size: 3137
dataset_size: 1821
---
# Dataset Card for "MULTI_VALUE_sst2_got_gotten"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pradeep239/Donut_50_BalanceSheet | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 23912854.0
num_examples: 42
- name: validation
num_bytes: 2612158.0
num_examples: 5
- name: test
num_bytes: 1767437.0
num_examples: 3
download_size: 19820213
dataset_size: 28292449.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ConnorLuckettDSTG/SARFish | ---
license: apache-2.0
task_categories:
- object-detection
- image-classification
tags:
- SARFish
- Illegal Fishing
- Computer Vision
- Complex-Valued
- Synthetic Aperture Radar
pretty_name: SARFish Dataset
size_categories:
- n<1K
---
SARFish is a [Synthetic Aperture Radar (SAR)](https://sentinel.esa.int/web/sentinel/missions/sentinel-1/instrument-payload) imagery dataset for the purpose of training, validating and testing supervised machine learning models on the tasks of ship detection, classification, and length regression. The SARFish dataset builds on the excellent work of the [xView3-SAR dataset](https://iuu.xview.us/dataset) (2021) and consists of two parts:
1. Data - Extends the xView3-SAR dataset to include [Single Look Complex (SLC)](https://sentinels.copernicus.eu/web/sentinel/technical-guides/sentinel-1-sar/products-algorithms/level-1-algorithms/single-look-complex) as well as [Ground Range Detected (GRD)](https://sentinels.copernicus.eu/web/sentinel/technical-guides/sentinel-1-sar/products-algorithms/level-1-algorithms/ground-range-detected) imagery data taken directly from the European Space Agency (ESA) Copernicus Programme [Open Access Hub Website](https://scihub.copernicus.eu/).
2. Labels - Derives labels from the xView3-SAR dataset providing maritime object location, vessel classification and vessel length information.
### Quick Links
The following are links to the Kaggle competitions for each of the tracks of the SARFish challenge along with the SARFish dataset and GitHub repo:
- Data:
- [SARFish](https://huggingface.co/datasets/ConnorLuckettDSTG/SARFish)
- [SARFishSample](https://huggingface.co/datasets/ConnorLuckettDSTG/SARFishSample)
- [Labels](https://iuu.xview.us/download-links)
- Challenge:
- [Maritime Object Detection Track](https://www.kaggle.com/competitions/sarfish-maritime-object-detection)
- [Maritime Object Classification Track](https://www.kaggle.com/competitions/sarfish-maritime-object-classification)
- [Vessel Length Regression Track](https://www.kaggle.com/competitions/sarfish-vessel-length-regression)
- [GitHub repo](https://github.com/RitwikGupta/SARFish)
- [Mailbox](mailto:SARFish.Dataset@defence.gov.au)
- [DAIRNet](https://www.dairnet.com.au/events/workshop-on-complex-valued-deep-learning-and-sarfish-challenge/)
The [GitHub repo](https://github.com/RitwikGupta/SARFish) describes how to:
- Download the dataset.
- Run the SARFish_demo jupyter notebook.
- Load imagery products and groundtruth labels.
- Train and evaluate a reference/baseline model using the dataset.
### Dataset summary - What does the SARFish dataset consist of?
The following table summarises the sizes of the full size and sample SARFish dataset.
| dataset | coincident GRD, SLC products | compressed (GB) | uncompressed (GB) |
| --- | --- | --- | --- |
| SARFishSample | 1 | 4.3 | 8.2 |
| SARFish | 753 | 3293 | 6468 |
The following table summarises the partitions of the dataset:
| Partition | Coincident products | Labels provided | Unique maritime object labels (SLC) | Unique maritime object labels (GRD) |
| --- | --- | --- | --- | --- |
| train | 553 | True | 63071 | 64054 |
| validation | 50 | True | 18906 | 19222 |
| public | 150 | False | 58744 | 60008 |
| **Total** | 753 | | 140721 | 143284 |
### How to access the SARFish dataset
The SARFish dataset is available for download at:
- [full SARFish dataset](https://huggingface.co/datasets/ConnorLuckettDSTG/SARFish)
- [sample SARFish dataset](https://huggingface.co/datasets/ConnorLuckettDSTG/SARFishSample)
#### Full SARFish dataset
Make sure you have at least enough storage space for the uncompressed dataset.
```bash
cd /path/to/large/storage/location
```
Create or log in to a [huggingface](https://huggingface.co) account.
Log in to the huggingface command line interface.
```bash
huggingface-cli login
```
Copy the access token from Settings/Access Tokens in your huggingface account. Clone the dataset:
```bash
git lfs install
git clone https://huggingface.co/datasets/ConnorLuckettDSTG/SARFish
```
#### SARFish sample dataset
Substitute the final command for the full dataset with the following:
```bash
git clone https://huggingface.co/datasets/ConnorLuckettDSTG/SARFishSample
```
Follow the instructions in the GitHub repo README to check the md5sums of the data and unzip the files.
#### Labels
The SARFish dataset labels are derived from the labels supplied with the [xView-3 SAR dataset](https://iuu.xview.us/dataset). The SARFish dataset labels are available for download from the [DIU website](https://iuu.xview.us/download-links). Be sure to take into account country restrictions.
### Data
SARFish extends the xView3-SAR dataset by providing products from the [Sentinel-1 C-band SAR satellite constellation](https://sentinel.esa.int/web/sentinel/missions/sentinel-1) operated by the European Space Agency’s (ESA) Copernicus Programme available on their [Open Access Hub Website](https://scihub.copernicus.eu/) in both real-valued GRD and complex-valued SLC product types.

The above image shows a condensed summary of the image formation pipeline of the Sentinel-1 products provided by the Sentinel-1 Mission Performance Center. Note that the SLC and GRD products both share a common ancestor.

The above image shows the relationship between the xView3-SAR and SARFish datasets.
#### Summary table
The following table compares the GRD and SLC products of the SARFish dataset [3][4].
| | | |
| --- | --- | --- |
| Platform | Sentinel-1 (A, B) | |
| Operator | European Space Agency (ESA) Sentinel-1 Mission Performance Center | |
| Sensor | CBand SAR | |
| Mode | Interferometric Wide Swath (IW) | |
| Polarisations | VV, VH | |
| Ground range coverage (km) | 251.8 | |
| Product type | SLC | GRD |
| Pixel value | Complex | Magnitude Detected |
| Data type | Complex Int16 | Unsigned Int16 |
| Azimuth pixel spacing (m) | 2.3 | 10 |
| Range pixel spacing (m) | 14.1 | 10 |
#### Ground Range Detected (GRD) Products
GRD products consist of two 'detected' imagery products in VH, VV polarisations. The imagery data is stored in GeoTiff format. Also included in the dataset are no_data masks and shoreline files which are used to evaluate 'close-to-shore' maritime object detection tasks.
#### Single Look Complex (SLC) Products



The figures above show the 'swaths' comprising a SARFish SLC product in VH polarisation with groundtruth maritime object labels. The complex data has been 'detected' [3] by projecting the complex-valued data onto the real numbers for visualisation and is displayed on a decibel scale with a dynamic range between 15 and 60 dB. Note that the SLC products have non-square (x, y): 2.3 × 14.1 m pixel spacing. The native format of the data is Complex Int16.

The figure above shows the footprint of the first swath of the example SLC product in context. The footprint was plotted using Clyde D'Cruz's ["openstreetmap WKT playground"](https://clydedacruz.github.io/openstreetmap-wkt-playground/).


The above images show detail of a labelled vessel in a SLC product in both VH (above) and VV (below) polarisations. Note the differences in the speckle and side-lobing artefacts on the vessel between polarisations and the non-square pixel spacing.
### Labels
#### Location labels
The labels denote the image pixel and geographic coordinate location of the maritime object.
| field | data_type | description |
| --------- | ----------- | --------- |
| detect\_lat | float | latitude of detection in World Geodetic System (WGS) 84 coordinates |
| detect\_lon | float | longitude of detection in WGS84 coordinates |
| detect\_scene\_row | int | pixel row of scene containing detection |
| detect\_scene\_column | int | pixel column of scene containing detection |
#### Classification Labels
The labels for the maritime object classification are organised in the same hierarchical structure as the xView3-SAR challenge labels:
```bash
label_hierarchy:
└── maritime_objects
└── vessels
└── fishing_vessels
```
They are denoted by the following columns in the labels:
| field | data_type | description |
| --------- | ----------- | --------- |
| is\_vessel | bool | True if detection is a vessel, False otherwise |
| is\_fishing | bool | True if detection is a fishing vessel, False otherwise |
The maritime object categories are labelled with boolean answers to the following questions:
- is the maritime object a vessel?
- is the vessel a fishing vessel?
The following table shows the combinations of hierarchical classification labels present in the SARFish dataset:
| is\_vessel | is\_fishing |
|------------:|-------------:|
| False | nan |
| True | nan |
| True | False |
| True | True |
| nan | nan |
#### Vessel Length Labels
The vessel lengths are denoted in the following column in the labels:
| field | data_type | description |
| --------- | ----------- | --------- |
| vessel\_length\_m | float | length of vessel in meters; only provided where available from AIS |
#### Detailed labels summary
| field | data_type | description |
| --------- | ----------- | --------- |
| partition | str: \{"train", "validation"\} | split of the dataset |
| product\_type | str: \{"GRD", "SLC"\} | product type of the data |
| scene\_id | str | unique xView3 scene ID for challenge purposes |
| detect\_id | str | unique detection ID in the format: {scene\_id}\_{detect\_lat}\_{detect\_lon} |
| \{product\_type\}\_product\_identifier | str | The Copernicus Sentinel-1 product identifier for the designated product type |
| detect\_lat | float | latitude of detection in World Geodetic System (WGS) 84 coordinates |
| detect\_lon | float | longitude of detection in WGS84 coordinates |
| detect\_scene\_row | int | pixel row of scene containing detection |
| detect\_scene\_column | int | pixel column of scene containing detection |
| top | float | pixel row of the top left corner of the bounding box, where available |
| left | float | pixel column of the top left corner of the bounding box, where available |
| bottom | float | pixel row of the bottom right corner of the bounding box, where available |
| right | float | pixel column of the bottom right corner of the bounding box, where available |
| vessel\_length\_m | float | length of vessel in meters; only provided where available from AIS |
| source | str: \{AIS, AIS/Manual, Manual\} | source of detection (AIS, manual label, or both) |
| is\_vessel | bool | True if detection is a vessel, False otherwise |
| is\_fishing | bool | True if detection is a fishing vessel, False otherwise |
| global\_shoreline\_vector\_distance\_from\_shore\_km | float | distance from shore of detection in kilometers as determined using the global shoreline vectors projected into the pixel space of the SARFish products |
| xView3\_shoreline\_vector\_distance\_from\_shore\_km | float | distance from shore of detection in kilometers as determined using the xView3-SAR shoreline vectors projected into the pixel space of the SARFish products |
| confidence | str: \{HIGH, MEDIUM, LOW\} | level of confidence for is\_vessel and is\_fishing labels |
### Source
The Sentinel-1 GRD and SLC products were downloaded from the University of Alaska's Alaska Satellite Facility (ASF), which operates NASA's Distributed Active Archive Center (DAAC).
- [website](https://asf.alaska.edu/)
- [registration](https://urs.earthdata.nasa.gov/users/new)
- [download](https://datapool.asf.alaska.edu/)
- API docs
- [basics](https://docs.asf.alaska.edu/api/basics/)
- [keywords](https://docs.asf.alaska.edu/api/keywords/)
- [tools](https://docs.asf.alaska.edu/api/tools/)
[1] Tri-Tan Cao, Connor Luckett, Jerome Williams, Tristrom Cooke, Ben Yip, Arvind Rajagopalan, and Sebastien Wong. SARFish: Space-based maritime surveillance using complex synthetic aperture radar imagery. In 2022 International Conference on Digital Image Computing: Techniques and Applications (DICTA), pages 1–8. IEEE, 2022.
[2] xview3-sar: Detecting dark fishing activity using synthetic aperture radar imagery. arXiv:2206.00897v4 [cs.CV], Nov 2022.
[3] M. Bourbigot, H. Johnsen, R. Piantanida, and G. Hajduch, Sentinel-1 Product Definition. Sentinel-1 Mission Performance Centre, 2016. [Online]. Available: https://sentinel.esa.int/web/sentinel/user-guides/sentinel-1-sar/document-library/-/asset_publisher/1dO7RF5fJMbd/content/sentinel-1-product-definition
[4] S. N. R. Chandra, J. Christopherson, and K. A. Casey, 2020 Joint Agency Commercial Imagery Evaluation—Remote sensing satellite compendium. US Geological Survey, 2020. |
Astral-P/ElizabethMidford | ---
license: wtfpl
---
|
juancopi81/orca-math-word-problems-150030_160032 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 6011926
num_examples: 10002
download_size: 2004071
dataset_size: 6011926
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AnkitPatil/Ellora_Caves | ---
license: other
license_name: ankit-patil
license_link: LICENSE
---
|
discofuse | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: discofuse
pretty_name: DiscoFuse
tags:
- sentence-fusion
dataset_info:
- config_name: discofuse-sport
features:
- name: connective_string
dtype: string
- name: discourse_type
dtype: string
- name: coherent_second_sentence
dtype: string
- name: has_coref_type_pronoun
dtype: float32
- name: incoherent_first_sentence
dtype: string
- name: incoherent_second_sentence
dtype: string
- name: has_coref_type_nominal
dtype: float32
- name: coherent_first_sentence
dtype: string
splits:
- name: train
num_bytes: 14736176073
num_examples: 43291020
- name: test
num_bytes: 151655243
num_examples: 445521
- name: validation
num_bytes: 150206657
num_examples: 440902
download_size: 9422142544
dataset_size: 15038037973
- config_name: discofuse-wikipedia
features:
- name: connective_string
dtype: string
- name: discourse_type
dtype: string
- name: coherent_second_sentence
dtype: string
- name: has_coref_type_pronoun
dtype: float32
- name: incoherent_first_sentence
dtype: string
- name: incoherent_second_sentence
dtype: string
- name: has_coref_type_nominal
dtype: float32
- name: coherent_first_sentence
dtype: string
splits:
- name: train
num_bytes: 6377885028
num_examples: 16310585
- name: test
num_bytes: 64007750
num_examples: 163657
- name: validation
num_bytes: 65681627
num_examples: 168081
download_size: 3929336540
dataset_size: 6507574405
configs:
- config_name: discofuse-sport
data_files:
- split: train
path: discofuse-sport/train-*
- split: test
path: discofuse-sport/test-*
- split: validation
path: discofuse-sport/validation-*
- config_name: discofuse-wikipedia
data_files:
- split: train
path: discofuse-wikipedia/train-*
- split: test
path: discofuse-wikipedia/test-*
- split: validation
path: discofuse-wikipedia/validation-*
---
# Dataset Card for "discofuse"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/google-research-datasets/discofuse
- **Paper:** [DiscoFuse: A Large-Scale Dataset for Discourse-Based Sentence Fusion](https://arxiv.org/abs/1902.10526)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 6.04 GB
- **Size of the generated dataset:** 21.55 GB
- **Total amount of disk used:** 27.59 GB
### Dataset Summary
DiscoFuse is a large-scale dataset for discourse-based sentence fusion.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### discofuse-sport
- **Size of downloaded dataset files:** 4.33 GB
- **Size of the generated dataset:** 15.04 GB
- **Total amount of disk used:** 19.36 GB
An example of 'train' looks as follows.
```
{
"coherent_first_sentence": "Four LPr and three LC2000r HP Netservers handle customer management and web server functions .",
"coherent_second_sentence": "Finally , an HP Netserver LT6000r hosts i2 Demand Planner and i2 Collaboration Planner .",
"connective_string": "finally ,",
"discourse_type": "PAIR_CONN",
"has_coref_type_nominal": 0.0,
"has_coref_type_pronoun": 0.0,
"incoherent_first_sentence": "Four LPr and three LC2000r HP Netservers handle customer management and web server functions .",
"incoherent_second_sentence": "An HP Netserver LT6000r hosts i2 Demand Planner and i2 Collaboration Planner ."
}
```
#### discofuse-wikipedia
- **Size of downloaded dataset files:** 1.72 GB
- **Size of the generated dataset:** 6.51 GB
- **Total amount of disk used:** 8.23 GB
An example of 'validation' looks as follows.
```
{
"coherent_first_sentence": "Four LPr and three LC2000r HP Netservers handle customer management and web server functions .",
"coherent_second_sentence": "Finally , an HP Netserver LT6000r hosts i2 Demand Planner and i2 Collaboration Planner .",
"connective_string": "finally ,",
"discourse_type": "PAIR_CONN",
"has_coref_type_nominal": 0.0,
"has_coref_type_pronoun": 0.0,
"incoherent_first_sentence": "Four LPr and three LC2000r HP Netservers handle customer management and web server functions .",
"incoherent_second_sentence": "An HP Netserver LT6000r hosts i2 Demand Planner and i2 Collaboration Planner ."
}
```
### Data Fields
The data fields are the same among all splits.
#### discofuse-sport
- `connective_string`: a `string` feature.
- `discourse_type`: a `string` feature.
- `coherent_second_sentence`: a `string` feature.
- `has_coref_type_pronoun`: a `float32` feature.
- `incoherent_first_sentence`: a `string` feature.
- `incoherent_second_sentence`: a `string` feature.
- `has_coref_type_nominal`: a `float32` feature.
- `coherent_first_sentence`: a `string` feature.
#### discofuse-wikipedia
- `connective_string`: a `string` feature.
- `discourse_type`: a `string` feature.
- `coherent_second_sentence`: a `string` feature.
- `has_coref_type_pronoun`: a `float32` feature.
- `incoherent_first_sentence`: a `string` feature.
- `incoherent_second_sentence`: a `string` feature.
- `has_coref_type_nominal`: a `float32` feature.
- `coherent_first_sentence`: a `string` feature.
### Data Splits
| name | train |validation| test |
|-------------------|-------:|---------:|-----:|
|discofuse-sport |43291020| 440902|445521|
|discofuse-wikipedia|16310585| 168081|163657|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The data is licensed under the [Creative Commons Attribution-ShareAlike 3.0](https://creativecommons.org/licenses/by-sa/3.0/) license.
### Citation Information
```
@InProceedings{GevaEtAl2019,
title = {DiscoFuse: A Large-Scale Dataset for Discourse-Based Sentence Fusion},
author = {Geva, Mor and Malmi, Eric and Szpektor, Idan and Berant, Jonathan},
booktitle = {Proceedings of the 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics},
note = {arXiv preprint arXiv:1902.10526},
year = {2019}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham), [@lewtun](https://github.com/lewtun) for adding this dataset. |
SamHernandez/my-style | ---
license: afl-3.0
---
|
argilla/distilabel-sample-evol-instruct | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: instructions
sequence: string
splits:
- name: train
num_bytes: 24384
num_examples: 13
download_size: 20771
dataset_size: 24384
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- distilabel
- synthetic
--- |
Falah/3d_object_drawing | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 181730
num_examples: 1000
download_size: 4528
dataset_size: 181730
---
# Dataset Card for "3d_object_drawing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/OK-VQA_test_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 848197053.0
num_examples: 5046
download_size: 849997989
dataset_size: 848197053.0
---
# Dataset Card for "OK-VQA_test_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ConnorDJ/ESSInfo | ---
license: apache-2.0
---
|
WENGSYX/LMTuner-law-v1 | ---
dataset_info:
features:
- name: conversations
sequence: string
- name: source
dtype: string
- name: version
dtype: string
splits:
- name: train
num_bytes: 69131211
num_examples: 92386
download_size: 34186545
dataset_size: 69131211
---
# Dataset Card for "Lingo-law-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kheopss/prompt_f3.0_to_hermes | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 8767450
num_examples: 1960
download_size: 3252670
dataset_size: 8767450
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_234 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1145411080
num_examples: 223190
download_size: 1171860611
dataset_size: 1145411080
---
# Dataset Card for "chunk_234"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roa7n/patched_test_p_150_f_membrane_v4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1552272870
num_examples: 2394171
download_size: 128097844
dataset_size: 1552272870
---
# Dataset Card for "patched_test_p_150_f_membrane_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713012528 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8734
num_examples: 23
download_size: 8548
dataset_size: 8734
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kastan/stormfront-small-textonly | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8544511
num_examples: 10000
- name: test
num_bytes: 352602
num_examples: 791
download_size: 5539239
dataset_size: 8897113
---
# Dataset Card for "stormfront-small-textonly"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FaalSa/dbscan5 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 114073960
num_examples: 1430
- name: validation
num_bytes: 114760360
num_examples: 1430
- name: test
num_bytes: 115446760
num_examples: 1430
download_size: 3208411
dataset_size: 344281080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
communityai/communityai_apt-instruct-code-micro-200k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 833031376.0805049
num_examples: 200000
download_size: 369609219
dataset_size: 833031376.0805049
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Dieginho/NicolasV1 | ---
license: openrail
---
|
chrislee973/whales-mel | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 1544189098.911
num_examples: 29997
download_size: 1134738496
dataset_size: 1544189098.911
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
GEM/RiSAWOZ | ---
annotations_creators:
- crowd-sourced
language_creators:
- unknown
language:
- zh
license:
- cc-by-4.0
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- conversational
task_ids: []
pretty_name: RiSAWOZ
tags:
- dialog-response-generation
---
# Dataset Card for GEM/RiSAWOZ
## Dataset Description
- **Homepage:** https://terryqj0107.github.io/RiSAWOZ_webpage
- **Repository:** https://github.com/terryqj0107/RiSAWOZ
- **Paper:** https://aclanthology.org/2020.emnlp-main.67
- **Leaderboard:** N/A
- **Point of Contact:** Deyi Xiong
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/RiSAWOZ).
### Dataset Summary
RiSAWOZ is a Chinese dialog dataset. It can be used to study various dialogue tasks, such as Dialogue State Tracking, Dialogue Context-to-Text Generation, Coreference Resolution and Unified Generative Ellipsis and Coreference Resolution.
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/RiSAWOZ')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/RiSAWOZ).
#### website
[Website](https://terryqj0107.github.io/RiSAWOZ_webpage)
#### paper
[ACL Anthology](https://aclanthology.org/2020.emnlp-main.67)
#### authors
Jun Quan (Soochow University, Suzhou, China), Shian Zhang (Soochow University, Suzhou, China), Qian Cao(Soochow University, Suzhou, China), Zizhong Li (Tianjin University, Tianjin, China), Deyi Xiong (Tianjin University, Tianjin, China)
## Dataset Overview
### Where to find the Data and its Documentation
#### Webpage
<!-- info: What is the webpage for the dataset (if it exists)? -->
<!-- scope: telescope -->
[Website](https://terryqj0107.github.io/RiSAWOZ_webpage)
#### Download
<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Github](https://github.com/terryqj0107/RiSAWOZ)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[ACL Anthology](https://aclanthology.org/2020.emnlp-main.67)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@inproceedings{quan-etal-2020-risawoz,
title = "{R}i{SAWOZ}: A Large-Scale Multi-Domain {W}izard-of-{O}z Dataset with Rich Semantic Annotations for Task-Oriented Dialogue Modeling",
author = "Quan, Jun and
Zhang, Shian and
Cao, Qian and
Li, Zizhong and
Xiong, Deyi",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.emnlp-main.67",
pages = "930--940",
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Deyi Xiong
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
dyxiong@tju.edu.cn
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
no
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
no
#### Covered Dialects
<!-- info: What dialects are covered? Are there multiple dialects per language? -->
<!-- scope: periscope -->
Only Mandarin Chinese is covered in this dataset.
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`Mandarin Chinese`
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-4.0: Creative Commons Attribution 4.0 International
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
RiSAWOZ can be used to support the study under various dialogue tasks, such as Natural Language Understanding, Dialogue State Tracking, Dialogue Context-to-Text Generation, Coreference Resolution and Unified Generative Ellipsis and Coreference Resolution.
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Dialog Response Generation
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
Generate system response given dialogue context across multiple domains.
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`academic`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
Soochow University and Tianjin University
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Jun Quan (Soochow University, Suzhou, China), Shian Zhang (Soochow University, Suzhou, China), Qian Cao(Soochow University, Suzhou, China), Zizhong Li (Tianjin University, Tianjin, China), Deyi Xiong (Tianjin University, Tianjin, China)
#### Funding
<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
the National Key Research and Development Project
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Tianhao Shen (Tianjin University, Tianjin, China), Chaobin You (Tianjin University, Tianjin, China), Deyi Xiong (Tianjin University, Tianjin, China)
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
- gem_id (string): GEM-RiSAWOZ-{split}-{id}
- dialogue_id (string): dialogue ID
- goal (string): natural language descriptions of the user goal
- domains (list of strings): domains mentioned in current dialogue session
- dialogue (list of dicts): dialog turns and corresponding annotations. Each turn includes:
- turn_id (int): turn ID
- turn_domain (list of strings): domain mentioned in current turn
- user_utterance (string): user utterance
- system_utterance (string): system utterance
- belief_state (dict): dialogue state, including:
- inform slot-values (dict): the slots and corresponding values informed until current turn
- turn_inform (dict): the slots and corresponding values informed in current turn
- turn request (list): the slots requested in current turn
- user_actions (list of lists): user dialogue acts in current turn
- user_actions (list of lists): system dialogue acts in current turn
- db_results (list of strings): database search results
- segmented_user_utterance (string): word segmentation result of user utterance
- segmented_system_utterance (string): word segmentation result of system utterance
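A minimal loading sketch using these fields (the hub ID `GEM/RiSAWOZ` and the nested schema above are assumptions; the GEM version of the loader may flatten dialogues per turn):

```python
from datasets import load_dataset

# Hub ID assumed to be GEM/RiSAWOZ; adjust if the loader name differs
ds = load_dataset("GEM/RiSAWOZ", split="train")

example = ds[0]
print(example["dialogue_id"], example["domains"])
for turn in example["dialogue"]:
    print(turn["turn_id"], turn["user_utterance"], "->", turn["system_utterance"])
```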
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
```
[
{
"dialogue_id": "attraction_goal_4-63###6177",
"goal": "attraction_goal_4-63: 你是苏州人,但不怎么出去玩。你朋友来苏州找你,你准备带他逛逛“水乡古镇”,你希望客服给你推荐个消费水平“中等”的地方。然后你要问清楚这地方“是否地铁直达”、“特点”、“门票价格”这些信息。最后,你要感谢客服的帮助,然后说再见。",
"domains": [
"旅游景点"
],
"dialogue": [
{
"turn_id": 0,
"turn_domain": [
"旅游景点"
],
"user_utterance": "你好,我是苏州人,但是不怎么出去玩,我朋友来苏州找我了,我准备带他逛逛水乡古镇,你能帮我推荐一下吗?",
"system_utterance": "推荐您去周庄古镇。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "水乡 古镇"
},
"turn_inform": {
"旅游景点-景点类型": "水乡 古镇"
},
"turn request": []
},
"user_actions": [
[
"Inform",
"旅游景点",
"景点类型",
"水乡 古镇"
],
[
"Greeting",
"旅游景点",
"",
""
]
],
"system_actions": [
[
"Recommend",
"旅游景点",
"名称",
"周庄 古镇"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为8",
"{'名称': '周庄古镇', '区域': '昆山', '景点类型': '水乡古镇', '最适合人群': '朋友出游', '消费': '偏贵', '是否地铁直达': '否', '门票价格': '100元', '电话号码': '400-8282900', '地址': '苏州市昆山市周庄镇全福路43号', '评分': 4.5, '开放时间': '07:30-18:50', '特点': '小桥流水与人家,双桥水巷摇橹船,还有沈万三的足迹待你寻访'}",
"{'名称': '同里古镇', '区域': '吴江', '景点类型': '水乡古镇', '最适合人群': '朋友出游', '消费': '偏贵', '是否地铁直达': '否', '门票价格': '100元', '电话号码': '0512-63325728', '地址': '苏州市吴江区同里古镇', '评分': 4.5, '开放时间': '07:30-17:30', '特点': '五湖环抱的江南水乡古镇,走三桥,参观小巧精致的园林退思园。'}",
"{'名称': '山塘街', '区域': '姑苏区', '景点类型': '水乡古镇', '最适合人群': '情侣约会', '消费': '中等', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '0512-65314467', '地址': '苏州市姑苏区古城西北', '评分': 4.5, '开放时间': '08:30-21:00', '特点': '欣赏苏州老城风貌,参观会馆等古迹,乘游船,拍摄美丽的夜景。'}",
"{'名称': '平江路', '区域': '姑苏区', '景点类型': '水乡古镇', '最适合人群': '情侣约会', '消费': '中等', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '0512-69163907', '地址': '江苏省苏州市姑苏区白塔东路65号', '评分': 4.5, '开放时间': '全天', '特点': '漫步保存完整的古街区,看小桥流水古宅,逛文艺小店,吃美食。'}",
"{'名称': '木渎古镇', '区域': '吴中区', '景点类型': '水乡古镇', '最适合人群': '朋友出游', '消费': '便宜', '是否地铁直达': '否', '门票价格': '免费', '电话号码': '0512-66514042', '地址': '苏州市吴中区木渎镇山塘街188号', '评分': 4.4, '开放时间': '08:30-17:00', '特点': '以园林为特色的古镇,游严家花园等古典园林,坐船看小桥流水。'}",
"{'名称': '甪直古镇', '区域': '吴中区', '景点类型': '水乡古镇', '最适合人群': '朋友出游', '消费': '便宜', '是否地铁直达': '否', '门票价格': '免费', '电话号码': '0512-66191668', '地址': '苏州市吴中区甪直镇晓市路21号', '评分': 4.3, '开放时间': '07:30-17:30', '特点': '甪直古镇有2500多年历史,甪直境内水流纵横,桥梁密布,有五湖之厅、六泽之冲之称。'}",
"{'名称': '千灯古镇', '区域': '昆山', '景点类型': '水乡古镇', '最适合人群': '朋友出游', '消费': '便宜', '是否地铁直达': '否', '门票价格': '免费', '电话号码': '0512-57472155', '地址': '苏州市昆山市千灯古镇尚书路1号', '评分': 4.3, '开放时间': '08:00-17:00', '特点': '千灯古镇,距今已有2500多年的历史,古镇白墙黑瓦,昆韵盎然。'}",
"{'名称': '锦溪古镇', '区域': '昆山', '景点类型': '水乡古镇', '最适合人群': '朋友出游', '消费': '中等', '是否地铁直达': '否', '门票价格': '65元', '电话号码': '0512-57224669', '地址': '苏州市昆山市锦溪镇邵甸港路18号', '评分': 4.4, '开放时间': '08:00-17:00', '特点': '锦溪古镇位于昆山南郊的淀山湖畔,是一座有千年历史的江南水乡。'}"
],
"segmented_user_utterance": "你好 , 我 是 苏州人 , 但是 不怎么 出去玩 , 我 朋友 来 苏州 找 我 了 , 我 准备 带 他 逛逛 水乡 古镇 , 你 能 帮 我 推荐 一下 吗 ?",
"segmented_system_utterance": "推荐 您 去 周庄 古镇 。"
},
{
"turn_id": 1,
"turn_domain": [
"旅游景点"
],
"user_utterance": "我希望这个地方消费水平适中的,这个是中等的吗?",
"system_utterance": "这个不是,我又重新帮您找了一家山塘街。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "水乡 古镇",
"旅游景点-消费": "中等"
},
"turn_inform": {
"旅游景点-消费": "中等"
},
"turn request": []
},
"user_actions": [
[
"Inform",
"旅游景点",
"消费",
"中等"
]
],
"system_actions": [
[
"Recommend",
"旅游景点",
"名称",
"山塘 街"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为3",
"{'名称': '山塘街', '区域': '姑苏区', '景点类型': '水乡古镇', '最适合人群': '情侣约会', '消费': '中等', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '0512-65314467', '地址': '苏州市姑苏区古城西北', '评分': 4.5, '开放时间': '08:30-21:00', '特点': '欣赏苏州老城风貌,参观会馆等古迹,乘游船,拍摄美丽的夜景。'}",
"{'名称': '平江路', '区域': '姑苏区', '景点类型': '水乡古镇', '最适合人群': '情侣约会', '消费': '中等', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '0512-69163907', '地址': '江苏省苏州市姑苏区白塔东路65号', '评分': 4.5, '开放时间': '全天', '特点': '漫步保存完整的古街区,看小桥流水古宅,逛文艺小店,吃美食。'}",
"{'名称': '锦溪古镇', '区域': '昆山', '景点类型': '水乡古镇', '最适合人群': '朋友出游', '消费': '中等', '是否地铁直达': '否', '门票价格': '65元', '电话号码': '0512-57224669', '地址': '苏州市昆山市锦溪镇邵甸港路18号', '评分': 4.4, '开放时间': '08:00-17:00', '特点': '锦溪古镇位于昆山南郊的淀山湖畔,是一座有千年历史的江南水乡。'}"
],
"segmented_user_utterance": "我 希望 这个 地方 消费水平 适中 的 , 这个 是 中等 的 吗 ?",
"segmented_system_utterance": "这个 不是 , 我 又 重新 帮 您 找 了 一家 山塘 街 。"
},
{
"turn_id": 2,
"turn_domain": [
"旅游景点"
],
"user_utterance": "听这地方应该挺好玩,他们家特点是什么?",
"system_utterance": "特点是欣赏苏州老城风貌,参观会馆等古迹,乘游船,拍摄美丽的夜景。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "水乡 古镇",
"旅游景点-消费": "中等",
"旅游景点-名称": "山塘 街"
},
"turn_inform": {
"旅游景点-名称": "山塘 街"
},
"turn request": [
"特点"
]
},
"user_actions": [
[
"Inform",
"旅游景点",
"名称",
"山塘 街"
],
[
"Request",
"旅游景点",
"特点",
""
]
],
"system_actions": [
[
"Inform",
"旅游景点",
"特点",
"欣赏 苏州 老城 风貌 , 参观 会馆 等 古迹 , 乘 游船 , 拍摄 美丽 的 夜景"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '山塘街', '区域': '姑苏区', '景点类型': '水乡古镇', '最适合人群': '情侣约会', '消费': '中等', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '0512-65314467', '地址': '苏州市姑苏区古城西北', '评分': 4.5, '开放时间': '08:30-21:00', '特点': '欣赏苏州老城风貌,参观会馆等古迹,乘游船,拍摄美丽的夜景。'}"
],
"segmented_user_utterance": "听 这 地方 应该 挺好玩 , 他们 家 特点 是 什么 ?",
"segmented_system_utterance": "特点 是 欣赏 苏州 老城 风貌 , 参观 会馆 等 古迹 , 乘 游船 , 拍摄 美丽 的 夜景 。"
},
{
"turn_id": 3,
"turn_domain": [
"旅游景点"
],
"user_utterance": "听起来很不错,他们家门票是多少?",
"system_utterance": "门票免费。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "水乡 古镇",
"旅游景点-消费": "中等",
"旅游景点-名称": "山塘 街"
},
"turn_inform": {},
"turn request": [
"门票价格"
]
},
"user_actions": [
[
"Request",
"旅游景点",
"门票价格",
""
]
],
"system_actions": [
[
"Inform",
"旅游景点",
"门票价格",
"免费"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '山塘街', '区域': '姑苏区', '景点类型': '水乡古镇', '最适合人群': '情侣约会', '消费': '中等', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '0512-65314467', '地址': '苏州市姑苏区古城西北', '评分': 4.5, '开放时间': '08:30-21:00', '特点': '欣赏苏州老城风貌,参观会馆等古迹,乘游船,拍摄美丽的夜景。'}"
],
"segmented_user_utterance": "听 起来 很 不错 , 他们 家 门票 是 多少 ?",
"segmented_system_utterance": "门票 免费 。"
},
{
"turn_id": 4,
"turn_domain": [
"旅游景点"
],
"user_utterance": "还行,那他们家是否有地铁直达呢?",
"system_utterance": "嗯嗯,可以地铁直达。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "水乡 古镇",
"旅游景点-消费": "中等",
"旅游景点-名称": "山塘 街"
},
"turn_inform": {},
"turn request": [
"是否地铁直达"
]
},
"user_actions": [
[
"Request",
"旅游景点",
"是否地铁直达",
""
]
],
"system_actions": [
[
"Inform",
"旅游景点",
"是否地铁直达",
"是"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '山塘街', '区域': '姑苏区', '景点类型': '水乡古镇', '最适合人群': '情侣约会', '消费': '中等', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '0512-65314467', '地址': '苏州市姑苏区古城西北', '评分': 4.5, '开放时间': '08:30-21:00', '特点': '欣赏苏州老城风貌,参观会馆等古迹,乘游船,拍摄美丽的夜景。'}"
],
"segmented_user_utterance": "还 行 , 那 他们 家 是否 有 地铁 直达 呢 ?",
"segmented_system_utterance": "嗯 嗯 , 可以 地铁 直达 。"
},
{
"turn_id": 5,
"turn_domain": [
"通用"
],
"user_utterance": "谢谢你的帮助,再见!",
"system_utterance": "再见。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "水乡 古镇",
"旅游景点-消费": "中等",
"旅游景点-名称": "山塘 街"
},
"turn_inform": {},
"turn request": []
},
"user_actions": [
[
"Bye",
"通用",
"",
""
]
],
"system_actions": [
[
"Bye",
"通用",
"",
""
]
],
"db_results": [],
"segmented_user_utterance": "谢谢 你 的 帮助 , 再见 !",
"segmented_system_utterance": "再见 。"
}
]
},
{
"dialogue_id": "attraction_goal_3-33_v2###7307",
"goal": "attraction_goal_3-33: 你和朋友一起来苏州出差,顺便想去“苏州乐园森林水世界”逛一逛。因此,你要询问客服这个地方“是否地铁直达”、“评分”、“特点”这些信息。最后,你向客服表示感谢,结束对话。",
"domains": [
"旅游景点"
],
"dialogue": [
{
"turn_id": 0,
"turn_domain": [
"旅游景点"
],
"user_utterance": "你好,我和朋友一起来苏州出差,顺便想去苏州乐园森林水世界玩玩,能否帮我看看这个地方是否有地铁直达呢?",
"system_utterance": "稍等,很抱歉,没有地铁能直接到这个景点呢。",
"belief_state": {
"inform slot-values": {
"旅游景点-名称": "苏州 乐园 森林 水 世界"
},
"turn_inform": {
"旅游景点-名称": "苏州 乐园 森林 水 世界"
},
"turn request": [
"是否地铁直达"
]
},
"user_actions": [
[
"Inform",
"旅游景点",
"名称",
"苏州 乐园 森林 水 世界"
],
[
"Request",
"旅游景点",
"是否地铁直达",
""
],
[
"Greeting",
"旅游景点",
"",
""
]
],
"system_actions": [
[
"Inform",
"旅游景点",
"是否地铁直达",
"否"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '苏州乐园森林水世界', '区域': '高新区', '景点类型': '游乐场', '最适合人群': '情侣约会', '消费': '偏贵', '是否地铁直达': '否', '门票价格': '198元', '电话号码': '0512-68717107', '地址': '苏州市高新区阳山东路以西,山神湾路以东', '评分': 4.4, '开放时间': '09:00-17:00', '特点': '各项水上娱乐项目皆具森林特色,被赋予森林主题,让人身处浩瀚森林又能享受到海洋般的舒爽水游乐。'}"
],
"segmented_user_utterance": "你好 , 我 和 朋友 一 起来 苏州 出差 , 顺便 想 去 苏州 乐园 森林 水 世界 玩玩 , 能否 帮 我 看看 这个 地方 是否 有 地铁 直达 呢 ?",
"segmented_system_utterance": "稍 等 , 很 抱歉 , 没有 地铁 能 直接 到 这个 景点 呢 。"
},
{
"turn_id": 1,
"turn_domain": [
"旅游景点"
],
"user_utterance": "你再帮我看看景点的评分好吗?",
"system_utterance": "这个景点的评分是4.4分。",
"belief_state": {
"inform slot-values": {
"旅游景点-名称": "苏州 乐园 森林 水 世界"
},
"turn_inform": {},
"turn request": [
"评分"
]
},
"user_actions": [
[
"Request",
"旅游景点",
"评分",
""
]
],
"system_actions": [
[
"Inform",
"旅游景点",
"评分",
"4.4"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '苏州乐园森林水世界', '区域': '高新区', '景点类型': '游乐场', '最适合人群': '情侣约会', '消费': '偏贵', '是否地铁直达': '否', '门票价格': '198元', '电话号码': '0512-68717107', '地址': '苏州市高新区阳山东路以西,山神湾路以东', '评分': 4.4, '开放时间': '09:00-17:00', '特点': '各项水上娱乐项目皆具森林特色,被赋予森林主题,让人身处浩瀚森林又能享受到海洋般的舒爽水游乐。'}"
],
"segmented_user_utterance": "你 再 帮 我 看看 景点 的 评分 好 吗 ?",
"segmented_system_utterance": "这个 景点 的 评分 是 4.4 分 。"
},
{
"turn_id": 2,
"turn_domain": [
"旅游景点"
],
"user_utterance": "这个景点有啥特点呢?",
"system_utterance": "这个景点的特点是各项水上娱乐项目皆具森林特色,被赋予森林主题,让人身处浩瀚森林又能享受到海洋般的舒爽水游乐。",
"belief_state": {
"inform slot-values": {
"旅游景点-名称": "苏州 乐园 森林 水 世界"
},
"turn_inform": {},
"turn request": [
"特点"
]
},
"user_actions": [
[
"Request",
"旅游景点",
"特点",
""
]
],
"system_actions": [
[
"Inform",
"旅游景点",
"特点",
"各项 水上 娱乐 项目 皆 具 森林 特色 , 被 赋予 森林 主题 , 让 人 身处 浩瀚 森林 又 能 享受 到 海洋 般的 舒爽 水 游乐 。"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '苏州乐园森林水世界', '区域': '高新区', '景点类型': '游乐场', '最适合人群': '情侣约会', '消费': '偏贵', '是否地铁直达': '否', '门票价格': '198元', '电话号码': '0512-68717107', '地址': '苏州市高新区阳山东路以西,山神湾路以东', '评分': 4.4, '开放时间': '09:00-17:00', '特点': '各项水上娱乐项目皆具森林特色,被赋予森林主题,让人身处浩瀚森林又能享受到海洋般的舒爽水游乐。'}"
],
"segmented_user_utterance": "这个 景点 有 啥 特点 呢 ?",
"segmented_system_utterance": "这个 景点 的 特点 是 各项 水上 娱乐 项目 皆 具 森林 特色 , 被 赋予 森林 主题 , 让 人 身处 浩瀚 森林 又 能 享受 到 海洋 般的 舒爽 水 游乐 。"
},
{
"turn_id": 3,
"turn_domain": [
"通用"
],
"user_utterance": "太好了,我觉得我会很喜欢这个地方,哈哈。",
"system_utterance": "这个景点挺不错的,就是没有地铁需要您乘坐其它交通工具了。",
"belief_state": {
"inform slot-values": {
"旅游景点-名称": "苏州 乐园 森林 水 世界"
},
"turn_inform": {},
"turn request": []
},
"user_actions": [
[
"General",
"通用",
"",
""
]
],
"system_actions": [
[
"General",
"通用",
"",
""
]
],
"db_results": [],
"segmented_user_utterance": "太好了 , 我 觉得 我会 很 喜欢 这个 地方 , 哈哈 。",
"segmented_system_utterance": "这个 景点 挺不错 的 , 就是 没有 地铁 需要 您 乘坐 其它 交通工具 了 。"
},
{
"turn_id": 4,
"turn_domain": [
"通用"
],
"user_utterance": "我打出租车去吧,就是多花点车费,没事。",
"system_utterance": "请问您还有其他问题吗?",
"belief_state": {
"inform slot-values": {
"旅游景点-名称": "苏州 乐园 森林 水 世界"
},
"turn_inform": {},
"turn request": []
},
"user_actions": [
[
"General",
"通用",
"",
""
]
],
"system_actions": [
[
"General",
"通用",
"",
""
]
],
"db_results": [],
"segmented_user_utterance": "我 打 出租车 去 吧 , 就是 多花点 车费 , 没事 。",
"segmented_system_utterance": "请问 您 还有 其他 问题 吗 ?"
},
{
"turn_id": 5,
"turn_domain": [
"通用"
],
"user_utterance": "没有了。谢谢你,再见。",
"system_utterance": "再见。",
"belief_state": {
"inform slot-values": {
"旅游景点-名称": "苏州 乐园 森林 水 世界"
},
"turn_inform": {},
"turn request": []
},
"user_actions": [
[
"Bye",
"通用",
"",
""
]
],
"system_actions": [
[
"Bye",
"通用",
"",
""
]
],
"db_results": [],
"segmented_user_utterance": "没有 了 。 谢谢 你 , 再见 。",
"segmented_system_utterance": "再见 。"
}
]
}
]
```
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
- Train:
- dialogues: 10000
- turns: 134580
- Avg. turns: 13.5
- Valid:
- dialogues: 600
- turns: 8116
- Avg. turns: 13.5
- Test:
- dialogues: 600
  - turns: 9286
- Avg. turns: 15.5
#### Outlier Example
<!-- info: What does an outlier of the dataset in terms of length/perplexity/embedding look like? -->
<!-- scope: microscope -->
The most complex example (i.e., the one with the most domains):
```
{
"dialogue_id": "attraction_restaurant_hotel_goal_2-69_v2###8355",
"goal": "attraction_restaurant_hotel_goal_2-69: 公司派你去苏州出差,你准备抽空旅游一下。第一,你想去一个“山水景区”,消费水平最好是“偏贵”的,你需要问清楚这地方“开放时间”、“门票价格”这些信息。第二,你要找个地方住,让客服给你推荐一家价位“中等”的“商务出行”酒店,并且询问这家酒店的“停车场”、“电话号码”这些信息。第三,你还想拔草美食,问客服有没有价位“中等”的“江浙菜”餐厅推荐,然后了解这家餐厅的“是否地铁直达”、“人均消费”这些信息。最后你要感谢客服的帮助,并说再见。",
"domains": [
"旅游景点",
"餐厅",
"酒店"
],
"dialogue": [
{
"turn_id": 0,
"turn_domain": [
"旅游景点"
],
"user_utterance": "你好,来苏州出差,准备抽空旅游一下。我想去山水景区,消费水平偏贵点,帮忙推荐一个。",
"system_utterance": "你好,向您推荐金鸡湖景区,感受一下苏州现代化的一面。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵"
},
"turn_inform": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵"
},
"turn request": []
},
"user_actions": [
[
"Inform",
"旅游景点",
"景点类型",
"山水 景区"
],
[
"Inform",
"旅游景点",
"消费",
"偏贵"
],
[
"Greeting",
"旅游景点",
"",
""
],
[
"General",
"旅游景点",
"",
""
]
],
"system_actions": [
[
"Greeting",
"通用",
"",
""
],
[
"General",
"通用",
"",
""
],
[
"Recommend",
"旅游景点",
"名称",
"金鸡湖 景区"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为2",
"{'名称': '金鸡湖景区', '区域': '工业园区', '景点类型': '山水景区', '最适合人群': '情侣约会', '消费': '偏贵', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '400-7558558', '地址': '苏州市工业园区星港街158号', '评分': 4.5, '开放时间': '全天', '特点': '看东方之门等高楼,坐摩天轮,乘船夜游,感受苏州现代化的一面。'}",
"{'名称': '沙家浜风景区', '区域': '常熟', '景点类型': '山水景区', '最适合人群': '家庭亲子', '消费': '偏贵', '是否地铁直达': '否', '门票价格': '110元', '电话号码': '0512-52500000', '地址': '苏州市常熟市沙家浜镇', '评分': 4.5, '开放时间': '08:00-16:30', '特点': '京剧样板戏《沙家浜》的发源地,访剧中场景,坐船深入芦苇荡。'}"
],
"segmented_user_utterance": "你好 , 来 苏州 出差 , 准备 抽空 旅游 一下 。 我 想 去 山水 景区 , 消费水平 偏 贵点 , 帮忙 推荐 一个 。",
"segmented_system_utterance": "你好 , 向 您 推荐 金鸡湖 景区 , 感受一下 苏州 现代化 的 一面 。"
},
{
"turn_id": 1,
"turn_domain": [
"旅游景点"
],
"user_utterance": "这里开放时间?需要安排一下。",
"system_utterance": "景区是全天开放的,可以尽情游览!",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区"
},
"turn_inform": {
"旅游景点-名称": "金鸡湖 景区"
},
"turn request": [
"开放时间"
]
},
"user_actions": [
[
"Inform",
"旅游景点",
"名称",
"金鸡湖 景区"
],
[
"Request",
"旅游景点",
"开放时间",
""
],
[
"General",
"旅游景点",
"",
""
]
],
"system_actions": [
[
"Inform",
"旅游景点",
"开放时间",
"全天"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '金鸡湖景区', '区域': '工业园区', '景点类型': '山水景区', '最适合人群': '情侣约会', '消费': '偏贵', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '400-7558558', '地址': '苏州市工业园区星港街158号', '评分': 4.5, '开放时间': '全天', '特点': '看东方之门等高楼,坐摩天轮,乘船夜游,感受苏州现代化的一面。'}"
],
"segmented_user_utterance": "这里 开放 时间 ? 需要 安排 一下 。",
"segmented_system_utterance": "景区 是 全天 开放 的 , 可以 尽情 游览 !"
},
{
"turn_id": 2,
"turn_domain": [
"旅游景点"
],
"user_utterance": "门票多少钱?",
"system_utterance": "这里是免费景区。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区"
},
"turn_inform": {},
"turn request": [
"门票价格"
]
},
"user_actions": [
[
"Request",
"旅游景点",
"门票价格",
""
]
],
"system_actions": [
[
"Inform",
"旅游景点",
"门票价格",
"免费"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '金鸡湖景区', '区域': '工业园区', '景点类型': '山水景区', '最适合人群': '情侣约会', '消费': '偏贵', '是否地铁直达': '是', '门票价格': '免费', '电话号码': '400-7558558', '地址': '苏州市工业园区星港街158号', '评分': 4.5, '开放时间': '全天', '特点': '看东方之门等高楼,坐摩天轮,乘船夜游,感受苏州现代化的一面。'}"
],
"segmented_user_utterance": "门票 多少 钱 ?",
"segmented_system_utterance": "这里 是 免费 景区 。"
},
{
"turn_id": 3,
"turn_domain": [
"酒店"
],
"user_utterance": "哦,好的。还有酒店,帮我推荐个价位中等的商务酒店!",
"system_utterance": "推荐苏州金鸡湖李公堤瑞贝庭公寓酒店,在金鸡湖景区附近。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区",
"酒店-价位": "中等",
"酒店-酒店类型": "商务 出行"
},
"turn_inform": {
"酒店-价位": "中等",
"酒店-酒店类型": "商务 出行"
},
"turn request": []
},
"user_actions": [
[
"Inform",
"酒店",
"价位",
"中等"
],
[
"Inform",
"酒店",
"酒店类型",
"商务 出行"
],
[
"General",
"酒店",
"",
""
]
],
"system_actions": [
[
"Recommend",
"酒店",
"名称",
"苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
],
[
"General",
"通用",
"",
""
]
],
"db_results": [
"数据库检索结果:成功匹配个数为16",
"{'名称': '苏州慢享主题酒店', '区域': '姑苏区', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '收费', '房费': '266元', '地址': '苏州姑苏区景德路26-64号', '电话号码': '0512-67570999', '评分': 4.7}",
"{'名称': '苏州慢享主题酒店', '区域': '姑苏区', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '收费', '房费': '278元', '地址': '苏州姑苏区景德路26-64号', '电话号码': '0512-67570999', '评分': 4.7}",
"{'名称': '美锦酒店', '区域': '高新区', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '免费', '房费': '308元', '地址': '苏州高新区滨河路999号花样年喜年生活广场5栋1层', '电话号码': '0512-66053331', '评分': 4.8}",
"{'名称': '美锦酒店', '区域': '高新区', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '免费', '房费': '349元', '地址': '苏州高新区滨河路999号花样年喜年生活广场5栋1层', '电话号码': '0512-66053331', '评分': 4.8}",
"{'名称': '苏州金鸡湖李公堤瑞贝庭公寓酒店', '区域': '工业园区', '星级': '4', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '免费', '房费': '438元', '地址': '苏州工业园区李公堤三期E区商业街9幢', '电话号码': '0512-69995666', '评分': 4.6}",
"{'名称': '苏州金鸡湖李公堤瑞贝庭公寓酒店', '区域': '工业园区', '星级': '4', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '免费', '房费': '438元', '地址': '苏州工业园区李公堤三期E区商业街9幢', '电话号码': '0512-69995666', '评分': 4.6}",
"{'名称': '苏州途乐酒店公寓', '区域': '工业园区', '星级': '2', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '收费', '房费': '486元', '地址': '苏州工业园区苏州丰隆城市中心T1楼', '电话号码': '151-5149-7911', '评分': 4.6}",
"{'名称': '苏州途乐酒店公寓', '区域': '工业园区', '星级': '2', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '收费', '房费': '486元', '地址': '苏州工业园区苏州丰隆城市中心T1楼', '电话号码': '151-5149-7911', '评分': 4.6}",
"{'名称': '万悦酒店', '区域': '吴中区', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '免费', '房费': '346元', '地址': '苏州吴中区金山路47-2号', '电话号码': '0512-83808380', '评分': 4.5}",
"{'名称': '万悦酒店', '区域': '吴中区', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '免费', '房费': '338元', '地址': '苏州吴中区金山路47-2号', '电话号码': '0512-83808380', '评分': 4.5}",
"{'名称': '周庄多瓦台临河客栈', '区域': '昆山', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '收费', '房费': '279元', '地址': '昆山周庄镇东浜村75号', '电话号码': '181-3619-1632', '评分': 4.8}",
"{'名称': '周庄多瓦台临河客栈', '区域': '昆山', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '收费', '房费': '279元', '地址': '昆山周庄镇东浜村75号', '电话号码': '181-3619-1632', '评分': 4.8}",
"{'名称': '常熟长江路亚朵酒店', '区域': '常熟', '星级': '4', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '免费', '房费': '312元', '地址': '常熟长江路268号', '电话号码': '0512-51661666', '评分': 4.8}",
"{'名称': '常熟长江路亚朵酒店', '区域': '常熟', '星级': '4', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '免费', '房费': '312元', '地址': '常熟长江路268号', '电话号码': '0512-51661666', '评分': 4.8}",
"{'名称': '维也纳酒店', '区域': '常熟', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '免费', '房费': '322元', '地址': '常熟碧溪镇金港路8号', '电话号码': '0512-52018188', '评分': 4.8}",
"{'名称': '维也纳酒店', '区域': '常熟', '星级': '3', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '免费', '房费': '340元', '地址': '常熟碧溪镇金港路8号', '电话号码': '0512-52018188', '评分': 4.8}"
],
"segmented_user_utterance": "哦 , 好 的 。 还有 酒店 , 帮 我 推荐 个 价位 中等 的 商务酒店 !",
"segmented_system_utterance": "推荐 苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店 , 在 金鸡湖 景区 附近 。"
},
{
"turn_id": 4,
"turn_domain": [
"酒店"
],
"user_utterance": "你想得真周到!这家酒店有停车场吗?",
"system_utterance": "酒店配备有一个免费的停车场。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区",
"酒店-价位": "中等",
"酒店-酒店类型": "商务 出行",
"酒店-名称": "苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
},
"turn_inform": {
"酒店-名称": "苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
},
"turn request": [
"停车场"
]
},
"user_actions": [
[
"Inform",
"酒店",
"名称",
"苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
],
[
"Request",
"酒店",
"停车场",
""
]
],
"system_actions": [
[
"Inform",
"酒店",
"停车场",
"免费"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为2",
"{'名称': '苏州金鸡湖李公堤瑞贝庭公寓酒店', '区域': '工业园区', '星级': '4', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '免费', '房费': '438元', '地址': '苏州工业园区李公堤三期E区商业街9幢', '电话号码': '0512-69995666', '评分': 4.6}",
"{'名称': '苏州金鸡湖李公堤瑞贝庭公寓酒店', '区域': '工业园区', '星级': '4', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '免费', '房费': '438元', '地址': '苏州工业园区李公堤三期E区商业街9幢', '电话号码': '0512-69995666', '评分': 4.6}"
],
"segmented_user_utterance": "你 想 得 真 周到 ! 这家 酒店 有 停车场 吗 ?",
"segmented_system_utterance": "酒店 配备 有 一个 免费 的 停车场 。"
},
{
"turn_id": 5,
"turn_domain": [
"酒店"
],
"user_utterance": "酒店的电话告诉我一下。",
"system_utterance": "电话号码:0512-69995666。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区",
"酒店-价位": "中等",
"酒店-酒店类型": "商务 出行",
"酒店-名称": "苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
},
"turn_inform": {},
"turn request": [
"电话号码"
]
},
"user_actions": [
[
"Request",
"酒店",
"电话号码",
""
]
],
"system_actions": [
[
"Inform",
"酒店",
"电话号码",
"0512 - 69995666"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为2",
"{'名称': '苏州金鸡湖李公堤瑞贝庭公寓酒店', '区域': '工业园区', '星级': '4', '价位': '中等', '酒店类型': '商务出行', '房型': '大床房', '停车场': '免费', '房费': '438元', '地址': '苏州工业园区李公堤三期E区商业街9幢', '电话号码': '0512-69995666', '评分': 4.6}",
"{'名称': '苏州金鸡湖李公堤瑞贝庭公寓酒店', '区域': '工业园区', '星级': '4', '价位': '中等', '酒店类型': '商务出行', '房型': '标准间', '停车场': '免费', '房费': '438元', '地址': '苏州工业园区李公堤三期E区商业街9幢', '电话号码': '0512-69995666', '评分': 4.6}"
],
"segmented_user_utterance": "酒店 的 电话 告诉 我 一下 。",
"segmented_system_utterance": "电话号码 : 0512 - 69995666 。"
},
{
"turn_id": 6,
"turn_domain": [
"餐厅"
],
"user_utterance": "嗯,我还想拔草美食。有没有价位中等的江浙菜餐厅推荐一个。",
"system_utterance": "推荐老枫庭。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区",
"餐厅-菜系": "江浙 菜",
"餐厅-价位": "中等",
"酒店-价位": "中等",
"酒店-酒店类型": "商务 出行",
"酒店-名称": "苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
},
"turn_inform": {
"餐厅-菜系": "江浙 菜",
"餐厅-价位": "中等"
},
"turn request": []
},
"user_actions": [
[
"Inform",
"餐厅",
"菜系",
"江浙 菜"
],
[
"Inform",
"餐厅",
"价位",
"中等"
],
[
"General",
"餐厅",
"",
""
]
],
"system_actions": [
[
"Recommend",
"餐厅",
"名称",
"老枫庭"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为3",
"{'名称': '老枫庭', '区域': '吴中区', '菜系': '江浙菜', '价位': '中等', '是否地铁直达': '否', '人均消费': '94元', '地址': '木渎灵天路55号芭提雅休闲娱乐广场5号楼', '电话号码': '0512-66277557', '评分': 4.6, '营业时间': '周一至周日 11:00-13:30 17:00-20:30', '推荐菜': '松鼠桂鱼'}",
"{'名称': '同里老家', '区域': '吴江', '菜系': '江浙菜', '价位': '中等', '是否地铁直达': '否', '人均消费': '87元', '地址': '同里镇环湖西路318号', '电话号码': '0512-63339777', '评分': 4.5, '营业时间': '周一至周日 10:00-22:00', '推荐菜': '太湖三白'}",
"{'名称': '周庄花间堂桔梗餐厅', '区域': '昆山', '菜系': '江浙菜', '价位': '中等', '是否地铁直达': '否', '人均消费': '98元', '地址': '周庄镇中市街110号景点迷楼北10米', '电话号码': '138-1289-9961', '评分': 4.5, '营业时间': '周一至周日 07:00-20:30', '推荐菜': '银鱼炒蛋'}"
],
"segmented_user_utterance": "嗯 , 我 还 想 拔草 美食 。 有没有 价位 中等 的 江浙 菜 餐厅 推荐 一个 。",
"segmented_system_utterance": "推荐 老枫庭 。"
},
{
"turn_id": 7,
"turn_domain": [
"餐厅"
],
"user_utterance": "这家餐厅地铁直达吗?",
"system_utterance": "抱歉,餐厅没有地铁直达!",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区",
"餐厅-菜系": "江浙 菜",
"餐厅-价位": "中等",
"餐厅-名称": "老枫庭",
"酒店-价位": "中等",
"酒店-酒店类型": "商务 出行",
"酒店-名称": "苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
},
"turn_inform": {
"餐厅-名称": "老枫庭"
},
"turn request": [
"是否地铁直达"
]
},
"user_actions": [
[
"Inform",
"餐厅",
"名称",
"老枫庭"
],
[
"Request",
"餐厅",
"是否地铁直达",
""
]
],
"system_actions": [
[
"Inform",
"餐厅",
"是否地铁直达",
"否"
],
[
"General",
"通用",
"",
""
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '老枫庭', '区域': '吴中区', '菜系': '江浙菜', '价位': '中等', '是否地铁直达': '否', '人均消费': '94元', '地址': '木渎灵天路55号芭提雅休闲娱乐广场5号楼', '电话号码': '0512-66277557', '评分': 4.6, '营业时间': '周一至周日 11:00-13:30 17:00-20:30', '推荐菜': '松鼠桂鱼'}"
],
"segmented_user_utterance": "这家 餐厅 地铁 直达 吗 ?",
"segmented_system_utterance": "抱歉 , 餐厅 没有 地铁 直达 !"
},
{
"turn_id": 8,
"turn_domain": [
"餐厅"
],
"user_utterance": "人均消费呢?",
"system_utterance": "老枫庭人均消费在94元左右。",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区",
"餐厅-菜系": "江浙 菜",
"餐厅-价位": "中等",
"餐厅-名称": "老枫庭",
"酒店-价位": "中等",
"酒店-酒店类型": "商务 出行",
"酒店-名称": "苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
},
"turn_inform": {},
"turn request": [
"人均消费"
]
},
"user_actions": [
[
"Request",
"餐厅",
"人均消费",
""
]
],
"system_actions": [
[
"Inform",
"餐厅",
"人均消费",
"老枫庭"
]
],
"db_results": [
"数据库检索结果:成功匹配个数为1",
"{'名称': '老枫庭', '区域': '吴中区', '菜系': '江浙菜', '价位': '中等', '是否地铁直达': '否', '人均消费': '94元', '地址': '木渎灵天路55号芭提雅休闲娱乐广场5号楼', '电话号码': '0512-66277557', '评分': 4.6, '营业时间': '周一至周日 11:00-13:30 17:00-20:30', '推荐菜': '松鼠桂鱼'}"
],
"segmented_user_utterance": "人均 消费 呢 ?",
"segmented_system_utterance": "老枫庭 人均 消费 在 94 元 左右 。"
},
{
"turn_id": 9,
"turn_domain": [
"通用"
],
"user_utterance": "好,所有问题都解决了,谢谢你的帮助!再见!",
"system_utterance": "不客气!这是我应该做的!再见!",
"belief_state": {
"inform slot-values": {
"旅游景点-景点类型": "山水 景区",
"旅游景点-消费": "偏贵",
"旅游景点-名称": "金鸡湖 景区",
"餐厅-菜系": "江浙 菜",
"餐厅-价位": "中等",
"餐厅-名称": "老枫庭",
"酒店-价位": "中等",
"酒店-酒店类型": "商务 出行",
"酒店-名称": "苏州 金鸡湖 李公堤 瑞贝庭 公寓 酒店"
},
"turn_inform": {},
"turn request": []
},
"user_actions": [
[
"Bye",
"通用",
"",
""
],
[
"General",
"通用",
"",
""
]
],
"system_actions": [
[
"Bye",
"通用",
"",
""
],
[
"General",
"通用",
"",
""
]
],
"db_results": [],
"segmented_user_utterance": "好 , 所有 问题 都 解决 了 , 谢谢 你 的 帮助 ! 再见 !",
"segmented_system_utterance": "不 客气 ! 这 是 我 应该 做 的 ! 再见 !"
}
]
}
```
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
<!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? -->
<!-- scope: microscope -->
RiSAWOZ is the first large-scale multi-domain Chinese Wizard-of-Oz dataset with rich semantic annotations.
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
yes
#### Unique Language Coverage
<!-- info: Does this dataset cover other languages than other datasets for the same task? -->
<!-- scope: periscope -->
no
#### Difference from other GEM datasets
<!-- info: What else sets this dataset apart from other similar datasets in GEM? -->
<!-- scope: microscope -->
The corpus contains rich semantic annotations, such as ellipsis and coreference, in addition to traditional dialogue annotations (dialogue states, dialogue acts, etc.), which can be used in various dialogue system tasks.
#### Ability that the Dataset measures
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: periscope -->
Natural Language Understanding, Dialogue State Tracking, Dialogue Context-to-Text Generation, Coreference Resolution, Unified Generative Ellipsis and Coreference Resolution
### GEM-Specific Curation
#### Modificatied for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
no
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
no
### Getting Started with the Task
#### Pointers to Resources
<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
[Website](https://terryqj0107.github.io/RiSAWOZ_webpage)
#### Technical Terms
<!-- info: Technical terms used in this card and the dataset and their definitions -->
<!-- scope: microscope -->
- In a task-oriented dialogue system, the Natural Language Understanding (NLU) module aims to convert the user utterance into a representation that computers can understand, which includes intent and dialogue act (slot & value) detection.
- Dialogue State Tracking (DST) is a core component in task-oriented dialogue systems, which extracts dialogue states (user goals) embedded in dialogue context. It has progressed toward open-vocabulary or generation-based DST where state-of-the-art models can generate dialogue states from dialogue context directly.
- Context-to-Text Generation: encoding dialogue context to decode system response.
- Coreference Resolution: predict coreference clusters where all mentions are referring to the same entity for each dialogue.
- Unified Generative Ellipsis and Coreference Resolution: generating omitted or referred expressions from the dialogue context.
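To make the generation-based DST term concrete, a tracker typically consumes the dialogue context and emits the belief state as a flat string. A toy linearization sketch (the exact format is a design choice, not prescribed by the dataset):

```python
def linearize_belief_state(belief_state: dict) -> str:
    """Flatten the accumulated `inform slot-values` into a 'domain-slot=value' string."""
    pairs = belief_state.get("inform slot-values", {})
    return " ; ".join(f"{slot}={value}" for slot, value in pairs.items())

state = {"inform slot-values": {"旅游景点-景点类型": "水乡 古镇", "旅游景点-消费": "中等"}}
print(linearize_belief_state(state))
# -> 旅游景点-景点类型=水乡 古镇 ; 旅游景点-消费=中等
```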
## Previous Results
### Previous Results
#### Measured Model Abilities
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: telescope -->
Natural Language Understanding, Dialogue State Tracking, Dialogue Context-to-Text Generation, Coreference Resolution, Unified Generative Ellipsis and Coreference Resolution
#### Metrics
<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`Other: Other Metrics`
#### Other Metrics
<!-- info: Definitions of other metrics -->
<!-- scope: periscope -->
- Natural Language Understanding:
  - F1 score: F1 score of user intent.
- Dialogue State Tracking:
  - Joint Accuracy: accuracy of turn-level dialogue states.
- Dialogue Context-to-Text Generation:
  - Inform Rate: measures the percentage of outputs that contain the appropriate entity the user asks for.
  - Success Rate: estimates the proportion of dialogues in which all the requested attributes have been answered.
  - BLEU: the BLEU score of the generated system response.
  - Combined Score: (Inform + Success) * 0.5 + BLEU, as an overall quality measure.
- Coreference Resolution:
  - MUC F1 Score: a link-based metric. Mentions in the same entity/cluster are considered "linked". MUC penalizes missing links and incorrect links with equal weight.
  - B3 F1 Score: a mention-based metric. The evaluation score depends on the fraction of correct mentions included in the response entities (i.e., entities created by the system).
  - CEAFφ4 F1 Score: a metric which assumes each key entity should be mapped to only one response entity, and vice versa. It aligns the key entities (clusters) with the response entities in the best way and computes scores from that alignment.
  - Average F1 Score: the average of the above three F1 scores.
- Unified Generative Ellipsis and Coreference Resolution:
  - Exact Match Rate: measures whether the generated utterances exactly match the ground-truth utterances.
  - BLEU: the BLEU score of the generated utterances.
  - Resolution F1: compares machine-generated words with ground-truth words only on the ellipsis/coreference part of user utterances.
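As a sanity check on the Combined Score definition above, a one-line sketch (inform/success rates given in percentage points, as is conventional):

```python
def combined_score(inform_rate: float, success_rate: float, bleu: float) -> float:
    """Overall quality: (Inform + Success) * 0.5 + BLEU."""
    return (inform_rate + success_rate) * 0.5 + bleu

print(combined_score(85.0, 70.0, 18.5))  # 96.0
```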
#### Proposed Evaluation
<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task. -->
<!-- scope: microscope -->
see "Definitions of other metrics"
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
yes
#### Other Evaluation Approaches
<!-- info: What evaluation approaches have others used? -->
<!-- scope: periscope -->
The same evaluation approaches as used for this dataset.
#### Relevant Previous Results
<!-- info: What are the most relevant previous results for this task/dataset? -->
<!-- scope: microscope -->
Joint Accuracy, Inform Rate, Success Rate, BLEU Score and Combined Score on MultiWOZ and CrossWOZ dataset.
## Dataset Curation
### Original Curation
#### Original Curation Rationale
<!-- info: Original curation rationale -->
<!-- scope: telescope -->
Gather human-to-human dialog in Chinese.
#### Communicative Goal
<!-- info: What was the communicative goal? -->
<!-- scope: periscope -->
Generate system response given dialogue context across multiple domains.
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
no
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Crowdsourced`
#### Where was it crowdsourced?
<!-- info: If crowdsourced, where from? -->
<!-- scope: periscope -->
`Other crowdworker platform`
#### Topics Covered
<!-- info: Does the language in the dataset focus on specific topics? How would you describe them? -->
<!-- scope: periscope -->
domains: Attraction, Restaurant, Hotel, Flight, Train, Weather, Movie, TV, Computer, Car, Hospital, Courses
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
validated by data curator
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
hybrid
#### Filter Criteria
<!-- info: What were the selection criteria? -->
<!-- scope: microscope -->
Rule-based and manual selection criteria
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
crowd-sourced
#### Number of Raters
<!-- info: What is the number of raters -->
<!-- scope: telescope -->
51<n<100
#### Rater Qualifications
<!-- info: Describe the qualifications required of an annotator. -->
<!-- scope: periscope -->
Chinese native speaker
#### Raters per Training Example
<!-- info: How many annotators saw each training example? -->
<!-- scope: periscope -->
3
#### Raters per Test Example
<!-- info: How many annotators saw each test example? -->
<!-- scope: periscope -->
3
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
no
#### Annotation Values
<!-- info: Purpose and values for each annotation -->
<!-- scope: microscope -->
- dialogue_id (string): dialogue ID
- goal (string): natural language descriptions of the user goal
- domains (list of strings): domains mentioned in current dialogue session
- turn_id (int): turn ID
- turn_domain (list of strings): domain mentioned in current turn
- belief_state (dict): dialogue state, including:
- inform slot-values (dict): the slots and corresponding values informed until current turn
- turn_inform (dict): the slots and corresponding values informed in current turn
  - turn request (list of strings): the slots requested in current turn
- user_actions (list of lists): user dialogue acts in current turn
- system_actions (list of lists): system dialogue acts in current turn
- db_results (list of strings): database search results
- segmented_user_utterance (string): word segmentation result of user utterance
- segmented_system_utterance (string): word segmentation result of system utterance
#### Any Quality Control?
<!-- info: Quality control measures? -->
<!-- scope: telescope -->
unknown
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
yes
#### Consent Policy Details
<!-- info: What was the consent policy? -->
<!-- scope: microscope -->
Annotators agreed to the use of the dataset for research purposes.
#### Other Consented Downstream Use
<!-- info: What other downstream uses of the data did the original data creators and the data curators consent to? -->
<!-- scope: microscope -->
Any
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
no PII
#### Justification for no PII
<!-- info: Provide a justification for selecting `no PII` above. -->
<!-- scope: periscope -->
The slots and values as well as utterances do not contain any personal information.
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
yes
#### Maintenance Plan Details
<!-- info: Describe the original dataset's maintenance plan. -->
<!-- scope: microscope -->
Building a leaderboard webpage to trace and display the latest results on the [dataset](https://terryqj0107.github.io/RiSAWOZ_webpage/)
#### Maintainer Contact Information
<!-- info: Provide contact information of a person responsible for the dataset maintenance -->
<!-- scope: periscope -->
Deyi Xiong (dyxiong@tju.edu.cn)
#### Any Contestation Mechanism?
<!-- info: Does the maintenance plan include a contestation mechanism allowing individuals to request removal fo content? -->
<!-- scope: periscope -->
contact maintainer
#### Contestation Form Link
<!-- info: Provide the form link or contact information -->
<!-- scope: periscope -->
Deyi Xiong (dyxiong@tju.edu.cn)
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset ore related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for exemple because their language, language variety, or social or geographical context is underepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
yes
#### Details on how Dataset Addresses the Needs
<!-- info: Describe how this dataset addresses the needs of underserved communities. -->
<!-- scope: microscope -->
RiSAWOZ is the first large-scale multi-domain Chinese Wizard-of-Oz dataset with rich semantic annotations.
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
no
#### Are the Language Producers Representative of the Language?
<!-- info: Does the distribution of language producers in the dataset accurately represent the full distribution of speakers of the language world-wide? If not, how does it differ? -->
<!-- scope: periscope -->
yes
## Considerations for Using the Data
### PII Risks and Liability
#### Potential PII Risk
<!-- info: Considering your answers to the PII part of the Data Curation Section, describe any potential privacy to the data subjects and creators risks when using the dataset. -->
<!-- scope: microscope -->
None
### Licenses
#### Copyright Restrictions on the Dataset
<!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
#### Copyright Restrictions on the Language Data
<!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
### Known Technical Limitations
#### Technical Limitations
<!-- info: Describe any known technical limitations, such as spurrious correlations, train/test overlap, annotation biases, or mis-annotations, and cite the works that first identified these limitations when possible. -->
<!-- scope: microscope -->
None
#### Unsuited Applications
<!-- info: When using a model trained on this dataset in a setting where users or the public may interact with its predictions, what are some pitfalls to look out for? In particular, describe some applications of the general task featured in this dataset that its curation or properties make it less suitable for. -->
<!-- scope: microscope -->
Using the trained model on domains that are not included in the 12 domains selected for this dataset.
#### Discouraged Use Cases
<!-- info: What are some discouraged use cases of a model trained to maximize the proposed metrics on this dataset? In particular, think about settings where decisions made by a model that performs reasonably well on the metric my still have strong negative consequences for user or members of the public. -->
<!-- scope: microscope -->
Designing models that leverage unknown bias in the dataset to optimize specific metrics.
|
KrisPi/PythonTutor-LIMA-Finetune | ---
license: cc-by-nc-sa-4.0
---
An attempt at a dataset for LIMA-style fine-tuning on top of the Phind model, which would result in:
- A new system prompt that creates a preference for a docstring under each function, using multiple functions even when it isn't strictly necessary, and commenting on every line of the code; it should also greatly reduce explanations before and after the code block.
- As a result, the model should be more readable for Junior Python Developers and additionally do step-by-step reasoning by default, improving code quality and HumanEval results.
Shuffled 1050 rows:
- 300 rows - LIMA Python Tutor
- 200 rows - LeetCode submissions
- 250 rows from Airoboros coding/python
- 150 rows from Airoboros orca
- 150 rows from Airoboros COT+TOM
Originally, fine-tuning was supposed to be based on:
https://huggingface.co/datasets/KrisPi/PythonTutor-Evol-1k-DPO-GPT4_vs_35
However, here is what I ended up doing:
1. Formatted output for the new System Prompt I wanted to introduce, then sampled the 300 largest rows under a 1024-token limit (Llama tokenizer):
```python
df['text'] = df.apply(lambda row: f"### System Prompt\nYou are an intelligent assistant. Act as expert Python tutor, thoroughly commenting the code.\n\n### User Message\n{row['instruction']}\n\n### Assistant\n{row['gpt4_output']}", axis=1)
```
2. Formatted LeetCode submissions in a Phind-like prompt format:
https://github.com/Nan-Do/LeetCodeContestsDataset/
```python
# Replace the instruction placeholder with the actual instruction
new_instruction = f"Using Python, solve following challenge: {entry['input']}"

# Format the output with the given template
text_prompt = f"### System Prompt\nYou are an intelligent assistant. Always wrap output between ```python and ```. Only code nothing else.\n\n### User Message\n{new_instruction}\n\n### Assistant\n```python\n{entry['output']}\n```"
```
3. Tokenized each row and filtered out those over 1024 tokens (see the sketch after this list)
4. Sampled the 200 largest outputs from the remaining rows
5. In a similar way, I formatted Airoboros 2.2.1 Dataset:
https://huggingface.co/datasets/jondurbin/airoboros-2.2.1
- 250 largest rows under 1024 tokens, where: category = coding and "python" in instruction; removed "PLAINFORMAT"; if a ```python ... ``` block was in the response I removed everything else; if ``` was missing, the whole response got wrapped in ```python and ```
- 150 largest rows under 1024 tokens, where: category = orca
- 150 random rows under 1024 tokens, where: category = cot or theory_of_mind
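A sketch of the token filtering in steps 3-4, assuming the `df` dataframe from step 1; the tokenizer checkpoint below is an assumption (any Llama-family tokenizer gives comparable counts):

```python
from transformers import AutoTokenizer

# Checkpoint name is an assumption; substitute your Llama tokenizer of choice
tokenizer = AutoTokenizer.from_pretrained("hf-internal-testing/llama-tokenizer")

def n_tokens(text: str) -> int:
    return len(tokenizer(text).input_ids)

# Keep rows that fit the 1024-token budget, then take the longest ones
df["tokens"] = df["text"].apply(n_tokens)
sample = df[df["tokens"] <= 1024].nlargest(200, "tokens")
```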
|
CyberHarem/isla_fanart_plasticmemories | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of [Anime-based] Isla (Plastic Memories)
This is the dataset of [Anime-based] Isla (Plastic Memories), containing 130 images and their tags.
The core tags of this character are `long_hair, red_eyes, ahoge, twintails, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 130 | 171.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isla_fanart_plasticmemories/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 130 | 94.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isla_fanart_plasticmemories/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 321 | 209.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isla_fanart_plasticmemories/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 130 | 148.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isla_fanart_plasticmemories/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 321 | 294.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isla_fanart_plasticmemories/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/isla_fanart_plasticmemories',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, boots, detached_sleeves, employee_uniform, necktie, solo, white_footwear, skirt, looking_at_viewer |
| 1 | 9 |  |  |  |  |  | 1girl, detached_sleeves, solo, necktie, blush, employee_uniform, looking_at_viewer, smile, sitting |
| 2 | 7 |  |  |  |  |  | bangs, closed_mouth, collared_shirt, detached_sleeves, hair_between_eyes, red_necktie, sleeveless_shirt, 1girl, blush, long_sleeves, looking_at_viewer, solo, black_shirt, very_long_hair, simple_background, smile, white_background, white_shirt, white_skirt, bare_shoulders, uniform, upper_body |
| 3 | 5 |  |  |  |  |  | 1girl, coat, plaid_scarf, blush, red_scarf, solo, hair_between_eyes, closed_eyes, facing_viewer, long_sleeves, scarf_over_mouth, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | boots | detached_sleeves | employee_uniform | necktie | solo | white_footwear | skirt | looking_at_viewer | blush | smile | sitting | bangs | closed_mouth | collared_shirt | hair_between_eyes | red_necktie | sleeveless_shirt | long_sleeves | black_shirt | very_long_hair | simple_background | white_background | white_shirt | white_skirt | bare_shoulders | uniform | upper_body | coat | plaid_scarf | red_scarf | closed_eyes | facing_viewer | scarf_over_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------------|:-------------------|:----------|:-------|:-----------------|:--------|:--------------------|:--------|:--------|:----------|:--------|:---------------|:-----------------|:--------------------|:--------------|:-------------------|:---------------|:--------------|:-----------------|:--------------------|:-------------------|:--------------|:--------------|:-----------------|:----------|:-------------|:-------|:--------------|:------------|:--------------|:----------------|:-------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | | X | | | | X | | | | | | X | | | X | | | | | | | | | X | X | X | X | X | X | X |
|
DeepLearner101/ImageNetSubset_2KI10C | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 96807080.0
num_examples: 2560
- name: validation
num_bytes: 18687118.0
num_examples: 500
download_size: 115447250
dataset_size: 115494198.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
torchgeo/l7sparcs | ---
license: cc0-1.0
task_categories:
- image-segmentation
tags:
- climate
pretty_name: L7 SPARCS
size_categories:
- n<1K
---
Dataset redistributed without change with permission from the author. If you use this dataset in your research, please cite the following paper: https://doi.org/10.3390/rs6064907 |
manoh2f2/tsterbak-lyrics-dataset-with-emotions | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: artist
dtype: string
- name: seq
dtype: string
- name: song
dtype: string
- name: emotions
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 50952563
num_examples: 36897
download_size: 21806915
dataset_size: 50952563
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TheGreatRambler/mm2_level_played | ---
language:
- multilingual
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- 1B<n<10B
source_datasets:
- original
task_categories:
- other
- object-detection
- text-retrieval
- token-classification
- text-generation
task_ids: []
pretty_name: Mario Maker 2 level plays
tags:
- text-mining
---
# Mario Maker 2 level plays
Part of the [Mario Maker 2 Dataset Collection](https://tgrcode.com/posts/mario_maker_2_datasets)
## Dataset Description
The Mario Maker 2 level plays dataset consists of 1 billion level plays from Nintendo's online service totaling around 20GB of data. The dataset was created using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api) over the course of 1 month in February 2022.
### How to use it
The Mario Maker 2 level plays dataset is a very large dataset, so for most use cases it is recommended to make use of the streaming API of `datasets`. You can load and iterate through the dataset with the following code:
```python
from datasets import load_dataset
ds = load_dataset("TheGreatRambler/mm2_level_played", streaming=True, split="train")
print(next(iter(ds)))
#OUTPUT:
{
'data_id': 3000004,
'pid': '6382913755133534321',
'cleared': 1,
'liked': 0
}
```
Each row is a unique play of the level denoted by `data_id`, performed by the player denoted by `pid`; `pid` is a 64-bit integer stored as a string due to database limitations. `cleared` and `liked` denote whether the player successfully cleared the level and/or liked the level during their play. Every level has only one unique play per player.
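Building on the streaming example, a small sketch of aggregating a per-level clear rate over a prefix of the stream (the 100,000-play cutoff is arbitrary, for illustration only):

```python
from collections import defaultdict
from itertools import islice

from datasets import load_dataset

ds = load_dataset("TheGreatRambler/mm2_level_played", streaming=True, split="train")

# Count plays and clears per level over the first 100k rows
plays, clears = defaultdict(int), defaultdict(int)
for row in islice(ds, 100_000):
    plays[row["data_id"]] += 1
    clears[row["data_id"]] += row["cleared"]

level = next(iter(plays))
print(level, clears[level] / plays[level])
```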
You can also download the full dataset. Note that this will download ~20GB:
```python
ds = load_dataset("TheGreatRambler/mm2_level_played", split="train")
```
## Data Structure
### Data Instances
```python
{
'data_id': 3000004,
'pid': '6382913755133534321',
'cleared': 1,
'liked': 0
}
```
### Data Fields
|Field|Type|Description|
|---|---|---|
|data_id|int|The data ID of the level this play occurred in|
|pid|string|Player ID of the player|
|cleared|bool|Whether the player cleared the level during their play|
|liked|bool|Whether the player liked the level during their play|
### Data Splits
The dataset only contains a train split.
<!-- TODO create detailed statistics -->
## Dataset Creation
The dataset was created over a little more than a month in February 2022 using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api). As requests made to Nintendo's servers require authentication, the process had to be done with the utmost care, limiting download speed so as not to overload the API and risk a ban. There are no intentions to create an updated release of this dataset.
## Considerations for Using the Data
The dataset contains no harmful language or depictions.
|
MyBad2K/LLama_pymes2_7b_TrainTest | ---
license: unknown
---
|
bene-ges/en_gtn_vocab | ---
license: cc-by-sa-4.0
language:
- en
size_categories:
- 1M<n<10M
tags:
- text normalization
- inverse text normalization
---
This is a vocabulary of (semiotic_class, normalized, non-normalized, freq) tuples from the [Google Text Normalization Dataset](https://www.kaggle.com/datasets/richardwilliamsproat/text-normalization-for-english-russian-and-polish).
It was generated using [this script](https://github.com/NVIDIA/NeMo/blob/main/examples/nlp/text_normalization_as_tagging/evaluation/get_multi_reference_vocab.py).
It can be used to perform fast text normalization; see the example in this [script](https://github.com/bene-ges/nemo_compatible/blob/main/scripts/nlp/en_spellmapper/dataset_preparation/normalize_by_gtn_vocab.py).
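A rough sketch of vocabulary-based normalization (the file name and column order below are assumptions; check the repository files for the actual layout):

```python
import csv
from collections import defaultdict

# Assumed: one tab-separated (semiotic_class, normalized, non-normalized, freq) tuple per line
vocab = defaultdict(list)
with open("gtn_vocab.tsv", encoding="utf-8") as f:
    for semiotic_class, normalized, non_normalized, freq in csv.reader(f, delimiter="\t"):
        vocab[non_normalized].append((normalized, int(freq)))

def normalize_token(written: str) -> str:
    """Map a written form to its most frequent spoken form, if one is known."""
    candidates = vocab.get(written)
    return max(candidates, key=lambda c: c[1])[0] if candidates else written
```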
|
lucyd/fcc-part15 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 556589
num_examples: 1648
download_size: 242959
dataset_size: 556589
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mrpc_those_them | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 9198
num_examples: 34
- name: train
num_bytes: 20933
num_examples: 77
- name: validation
num_bytes: 1166
num_examples: 4
download_size: 32088
dataset_size: 31297
---
# Dataset Card for "MULTI_VALUE_mrpc_those_them"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZiHDeng/hf-ny8-v2 | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1420530
num_examples: 7660
download_size: 144006
dataset_size: 1420530
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Birchlabs/openai-prm800k-phase1_test-stepwise-critique | ---
license: mit
---
|
cellos/DomesticNames_AllStates_Text | ---
license: mit
task_categories:
- question-answering
language:
- en
size_categories:
- 100K<n<1M
---
<p align="center"><h1> Domestic Names from the Federal Government's repository of official geographic names [CSV dataset]</h1></p>
This Dataset includes 980,065 geographic names as of September 10, 2023.
It is apparent that no currently released LLMs are pretrained on datasets with many of these geographic names (i.e., features), descriptions, and histories.
Example: *feature_name: Abercrombie Gulch*
- GPT-3.5 responds "I'm not aware of a specific location called Abercrombie Gulch in my training data,..." when prompted about this name.
- Here is the description found in this Dataset about that name. "Abercrombie Gulch trends N-S 3 mi. on a tributary of Lowe River, 2.5 mi. S of Valdez; Chugach Mts. Named in 1911 by J. W. Bagley, and F. E. Giffin, U.S. Geological Survey (USGS), for Captain William R. Abercrombie, USA, who explored this area in 1884 and 1898-99."
This Dataset also includes geographic point locations.
All domestic names are extracted from the Geographic Names Information System (GNIS), the Federal Government's repository of official geographic names. This Dataset is deduped, zero values are changed to null, and the sort order is randomized. This Dataset combines 56 domestic names files from [here](https://prd-tnm.s3.amazonaws.com/index.html?prefix=StagedProducts/GeographicNames/DomesticNames/) and then merged with description and history for historical features when found in **HistoricalFeatures_National_Text** [here](https://prd-tnm.s3.amazonaws.com/index.html?prefix=StagedProducts/GeographicNames/Topical/).
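A minimal lookup sketch with pandas (the file name and column names are assumptions based on the fields mentioned above; check the CSV header before use):

```python
import pandas as pd

# File and column names are assumed; verify against the actual CSV
df = pd.read_csv("DomesticNames_AllStates_Text.csv")

row = df.loc[df["feature_name"] == "Abercrombie Gulch"].iloc[0]
print(row["description"])
```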
|
MITCriticalData/Sentinel_L1C_2016_2021 | ---
license: mit
---
|
open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ | ---
pretty_name: Evaluation run of TheBloke/Lemur-70B-Chat-v1-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Lemur-70B-Chat-v1-GPTQ](https://huggingface.co/TheBloke/Lemur-70B-Chat-v1-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T06:46:13.725525](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ/blob/main/results_2023-08-31T06%3A46%3A13.725525.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6468074911221942,\n\
\ \"acc_stderr\": 0.03281612856930076,\n \"acc_norm\": 0.6509040444920074,\n\
\ \"acc_norm_stderr\": 0.032790646231639874,\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.01700810193916349,\n \"mc2\": 0.5711470281396481,\n\
\ \"mc2_stderr\": 0.015283087726691595\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670724,\n\
\ \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620446\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6475801633140809,\n\
\ \"acc_stderr\": 0.00476747536668976,\n \"acc_norm\": 0.8440549691296555,\n\
\ \"acc_norm_stderr\": 0.003620617550747387\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119667,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119667\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4603174603174603,\n \"acc_stderr\": 0.025670080636909186,\n \"\
acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.025670080636909186\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228426,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228426\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634611,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634611\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976064,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \
\ \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001512,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001512\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5072625698324023,\n\
\ \"acc_stderr\": 0.0167207374051795,\n \"acc_norm\": 0.5072625698324023,\n\
\ \"acc_norm_stderr\": 0.0167207374051795\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279046,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279046\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49934810951760106,\n\
\ \"acc_stderr\": 0.012770225252255534,\n \"acc_norm\": 0.49934810951760106,\n\
\ \"acc_norm_stderr\": 0.012770225252255534\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.025991117672813296,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.025991117672813296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.038913644958358175,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.038913644958358175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.01700810193916349,\n \"mc2\": 0.5711470281396481,\n\
\ \"mc2_stderr\": 0.015283087726691595\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Lemur-70B-Chat-v1-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|arc:challenge|25_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hellaswag|10_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T06:46:13.725525.parquet'
- config_name: results
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- results_2023-08-31T06:46:13.725525.parquet
- split: latest
path:
- results_2023-08-31T06:46:13.725525.parquet
---
# Dataset Card for Evaluation run of TheBloke/Lemur-70B-Chat-v1-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Lemur-70B-Chat-v1-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Lemur-70B-Chat-v1-GPTQ](https://huggingface.co/TheBloke/Lemur-70B-Chat-v1-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
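Similarly, the aggregated numbers can be loaded from the "results" configuration listed above; its "latest" split always points to the most recent run:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ",
	"results",
	split="latest")
```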
## Latest results
These are the [latest results from run 2023-08-31T06:46:13.725525](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ/blob/main/results_2023-08-31T06%3A46%3A13.725525.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6468074911221942,
"acc_stderr": 0.03281612856930076,
"acc_norm": 0.6509040444920074,
"acc_norm_stderr": 0.032790646231639874,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.01700810193916349,
"mc2": 0.5711470281396481,
"mc2_stderr": 0.015283087726691595
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670724,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620446
},
"harness|hellaswag|10": {
"acc": 0.6475801633140809,
"acc_stderr": 0.00476747536668976,
"acc_norm": 0.8440549691296555,
"acc_norm_stderr": 0.003620617550747387
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.025670080636909186,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.025670080636909186
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228426,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228426
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634611,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634611
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976064,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001512,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001512
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5072625698324023,
"acc_stderr": 0.0167207374051795,
"acc_norm": 0.5072625698324023,
"acc_norm_stderr": 0.0167207374051795
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279046,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279046
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49934810951760106,
"acc_stderr": 0.012770225252255534,
"acc_norm": 0.49934810951760106,
"acc_norm_stderr": 0.012770225252255534
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.038913644958358175,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.038913644958358175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.01700810193916349,
"mc2": 0.5711470281396481,
"mc2_stderr": 0.015283087726691595
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/ruby_oshinoko | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ruby
This is the dataset of Ruby, containing 200 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 455 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 455 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 455 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 455 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
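The packs above are plain zip archives in this repository, so one way to fetch one is with `huggingface_hub` (a minimal sketch, assuming the archives sit at the repo root as the table's relative links suggest):

```python
from huggingface_hub import hf_hub_download

# Download one pack; the filename matches the table's Download column.
path = hf_hub_download(
    repo_id="CyberHarem/ruby_oshinoko",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local path of the cached zip archive
```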
|
linhtran92/viet_youtube_asr_corpus_v2 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: w2v2_transcription
dtype: string
- name: WER
dtype: int64
- name: sum
dtype: float64
splits:
- name: train
num_bytes: 16003127322.525991
num_examples: 194747
- name: test
num_bytes: 1778161779.8073397
num_examples: 21639
download_size: 17552533348
dataset_size: 17781289102.333332
---
# Dataset Card for "viet_youtube_asr_corpus_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dotan1111/MSA-amino-8-seq | ---
tags:
- sequence-to-sequence
- bioinformatics
- biology
---
# Multiple Sequence Alignment as a Sequence-to-Sequence Learning Problem
## Abstract:
The sequence alignment problem is one of the most fundamental problems in bioinformatics, and a plethora of methods have been devised to tackle it. Here we introduce BetaAlign, a methodology for aligning sequences using an NLP approach. BetaAlign accounts for the possible variability of the evolutionary process among different datasets by using an ensemble of transformers, each trained on millions of samples generated from a different evolutionary model. Our approach leads to alignment accuracy that is similar to, and often better than, that of commonly used methods such as MAFFT, DIALIGN, ClustalW, T-Coffee, PRANK, and MUSCLE.

An illustration of aligning sequences with sequence-to-sequence learning. (a) Consider two input sequences "AAG" and "ACGG". (b) The result of encoding the unaligned sequences into the source language (*Concat* representation). (c) The sentence from the source language is translated to the target language via a transformer model. (d) The translated sentence in the target language (*Spaces* representation). (e) The resulting alignment, decoded from the translated sentence, in which "AA-G" is aligned to "ACGG". The transformer architecture illustration is adapted from (Vaswani et al., 2017).
## Data:
We used SpartaABC (Loewenthal et al., 2021) to generate millions of true alignments. SpartaABC requires the following input: (1) a rooted phylogenetic tree, which includes a topology and branch lengths; (2) a substitution model (amino acids or nucleotides); (3) root sequence length; (4) the indel model parameters, which include: insertion rate (*R_I*), deletion rate (*R_D*), a parameter for the insertion Zipfian distribution (*A_I*), and a parameter for the deletion Zipfian distribution (*A_D*). MSAs were simulated along random phylogenetic tree topologies generated using the program ETE version 3.0 (Huerta-Cepas et al., 2016) with default parameters.
We generated 1,495,000, 2,000, and 3,000 protein MSAs with ten sequences, which were used as training, validation, and testing data, respectively. We generated the same number of DNA MSAs. For each random tree, branch lengths were drawn from a uniform distribution in the range *(0.5,1.0)*. Next, the sequences were generated using SpartaABC with the following parameters: *R_I, R_D \in (0.0,0.05)*, *A_I, A_D \in (1.01,2.0)*. The alignment lengths as well as the sequence lengths of the tree leaves vary within and among datasets, as they depend on the indel dynamics and the root length. The root length was sampled uniformly in the range *[32,44]*. Unless stated otherwise, all protein datasets were generated with the WAG+G model, and all DNA datasets were generated with the GTR+G model, with the following parameters: (1) frequencies for the different nucleotides *(0.37, 0.166, 0.307, 0.158)*, in the order "T", "C", "A" and "G"; (2) substitution rates *(0.444, 0.0843, 0.116, 0.107, 0.00027)*, in the order "a", "b", "c", "d", and "e" for the substitution matrix.
## Example:
The following example corresponds to the MSA illustrated in the figure above:
{"MSA": "AAAC-GGG", "unaligned_seqs": {"seq0": "AAG", "seq1": "ACGG"}}
## APA
```
Dotan, E., Belinkov, Y., Avram, O., Wygoda, E., Ecker, N., Alburquerque, M., Keren, O., Loewenthal, G., & Pupko T. (2023). Multiple sequence alignment as a sequence-to-sequence learning problem. The Eleventh International Conference on Learning Representations (ICLR 2023).
```
## BibTeX
```
@article{Dotan_multiple_2023,
author = {Dotan, Edo and Belinkov, Yonatan and Avram, Oren and Wygoda, Elya and Ecker, Noa and Alburquerque, Michael and Keren, Omri and Loewenthal, Gil and Pupko, Tal},
month = aug,
title = {{Multiple sequence alignment as a sequence-to-sequence learning problem}},
year = {2023}
}
``` |
suolyer/pile_ubuntu-irc | ---
license: apache-2.0
---
|
Nexdata/American_English_Speech_Data_by_Mobile_Phone_Reading | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/American_English_Speech_Data_by_Mobile_Phone_Reading
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/78?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The dataset contains speech data from 349 American English speakers, all of whom are American locals. It was recorded in a quiet environment. The recording contents cover various categories such as economics, entertainment, news, and spoken language. The recordings were manually transcribed and annotated with starting and ending time points.
For more details, please refer to the link: https://www.nexdata.ai/datasets/78?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
American English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
relbert/conceptnet | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- n<1K
pretty_name: relbert/conceptnet
---
# Dataset Card for "relbert/conceptnet"
## Dataset Description
- **Repository:** [RelBERT](https://github.com/asahi417/relbert)
- **Paper:** [https://home.ttic.edu/~kgimpel/commonsense.html](https://home.ttic.edu/~kgimpel/commonsense.html)
- **Dataset:** High Confidence Subset of ConceptNet for link prediction
### Dataset Summary
The selected subset of ConceptNet used in [this work](https://home.ttic.edu/~kgimpel/commonsense.html).
We removed `NotCapableOf` and `NotDesires` to keep only the positive relations.
We use the original test set as the test set, dev1 as the training set, and dev2 as the validation set.
- Number of instances
| | train | validation | test |
|:--------------------------------|--------:|-------------:|-------:|
| number of pairs | 583082 | 1184 | 1187 |
| number of unique relation types | 28 | 20 | 19 |
- Number of pairs in each relation type
| | number of pairs (train) | number of pairs (validation) | number of pairs (test) |
|:-----------------|--------------------------:|-------------------------------:|-------------------------:|
| AtLocation | 69838 | 230 | 250 |
| CapableOf | 71840 | 124 | 144 |
| Causes | 34732 | 52 | 45 |
| CausesDesire | 9616 | 15 | 5 |
| CreatedBy | 534 | 1 | 2 |
| DefinedAs | 11048 | 2 | 1 |
| DesireOf | 28 | 0 | 0 |
| Desires | 8960 | 20 | 8 |
| HasA | 19234 | 43 | 41 |
| HasFirstSubevent | 7350 | 2 | 1 |
| HasLastSubevent | 5916 | 5 | 0 |
| HasPainCharacter | 2 | 0 | 0 |
| HasPainIntensity | 2 | 0 | 0 |
| HasPrerequisite | 47298 | 116 | 109 |
| HasProperty | 36610 | 63 | 70 |
| HasSubevent | 52468 | 82 | 83 |
| InheritsFrom | 112 | 0 | 0 |
| InstanceOf | 138 | 0 | 0 |
| IsA | 71034 | 197 | 211 |
| LocatedNear | 6 | 0 | 0 |
| LocationOfAction | 6 | 0 | 0 |
| MadeOf | 1518 | 10 | 14 |
| MotivatedByGoal | 23668 | 17 | 8 |
| PartOf | 5402 | 19 | 22 |
| ReceivesAction | 20656 | 15 | 11 |
| RelatedTo | 178 | 0 | 1 |
| SymbolOf | 328 | 2 | 0 |
| UsedFor | 84560 | 169 | 161 |
## Dataset Structure
An example of `train` looks as follows.
```json
{
"relation": "IsA",
"head": "baseball",
"tail": "sport"
}
```
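A short sketch for loading the data and reproducing the per-relation counts above; it assumes the repository is loadable with `datasets` (older script-based datasets may need `trust_remote_code=True`):

```python
from collections import Counter
from datasets import load_dataset

data = load_dataset("relbert/conceptnet")
print({split: data[split].num_rows for split in data})  # pair counts per split

# Per-relation breakdown of the training pairs.
train_counts = Counter(example["relation"] for example in data["train"])
print(train_counts.most_common(5))
```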
## Citation Information
```
@InProceedings{P16-1137,
author = "Li, Xiang
and Taheri, Aynaz
and Tu, Lifu
and Gimpel, Kevin",
title = "Commonsense Knowledge Base Completion",
booktitle = "Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) ",
year = "2016",
publisher = "Association for Computational Linguistics",
pages = "1445--1455",
location = "Berlin, Germany",
doi = "10.18653/v1/P16-1137",
url = "http://aclweb.org/anthology/P16-1137"
}
``` |
Vengal/new_dataset | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct | ---
pretty_name: Evaluation run of togethercomputer/Llama-2-7B-32K-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/Llama-2-7B-32K-Instruct](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T05:26:17.383848](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct/blob/main/results_2023-10-25T05-26-17.383848.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219147,\n \"f1\": 0.0664240771812081,\n\
\ \"f1_stderr\": 0.001541328214347489,\n \"acc\": 0.3877489045119805,\n\
\ \"acc_stderr\": 0.009164612305415161\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219147,\n\
\ \"f1\": 0.0664240771812081,\n \"f1_stderr\": 0.001541328214347489\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \
\ \"acc_stderr\": 0.005829898355937195\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893129\n\
\ }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T03_32_33.119122
path:
- '**/details_harness|drop|3_2023-10-25T03-32-33.119122.parquet'
- split: 2023_10_25T05_26_17.383848
path:
- '**/details_harness|drop|3_2023-10-25T05-26-17.383848.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T05-26-17.383848.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T03_32_33.119122
path:
- '**/details_harness|gsm8k|5_2023-10-25T03-32-33.119122.parquet'
- split: 2023_10_25T05_26_17.383848
path:
- '**/details_harness|gsm8k|5_2023-10-25T05-26-17.383848.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T05-26-17.383848.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T03_32_33.119122
path:
- '**/details_harness|winogrande|5_2023-10-25T03-32-33.119122.parquet'
- split: 2023_10_25T05_26_17.383848
path:
- '**/details_harness|winogrande|5_2023-10-25T05-26-17.383848.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T05-26-17.383848.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- results_2023-10-04T00-24-39.163717.parquet
- split: 2023_10_25T03_32_33.119122
path:
- results_2023-10-25T03-32-33.119122.parquet
- split: 2023_10_25T05_26_17.383848
path:
- results_2023-10-25T05-26-17.383848.parquet
- split: latest
path:
- results_2023-10-25T05-26-17.383848.parquet
---
# Dataset Card for Evaluation run of togethercomputer/Llama-2-7B-32K-Instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/Llama-2-7B-32K-Instruct](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct",
"harness_winogrande_5",
split="train")
```
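Beyond "train", the configuration list in the YAML header shows that every config also exposes each timestamped run and a "latest" alias as splits, so a specific run can be loaded directly:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct"

# "latest" always points at the most recent run for that config.
gsm8k_latest = load_dataset(repo, "harness_gsm8k_5", split="latest")

# A specific timestamped run, named as in the YAML header.
gsm8k_run = load_dataset(repo, "harness_gsm8k_5", split="2023_10_25T05_26_17.383848")
```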
## Latest results
These are the [latest results from run 2023-10-25T05:26:17.383848](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct/blob/main/results_2023-10-25T05-26-17.383848.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219147,
"f1": 0.0664240771812081,
"f1_stderr": 0.001541328214347489,
"acc": 0.3877489045119805,
"acc_stderr": 0.009164612305415161
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219147,
"f1": 0.0664240771812081,
"f1_stderr": 0.001541328214347489
},
"harness|gsm8k|5": {
"acc": 0.04700530705079606,
"acc_stderr": 0.005829898355937195
},
"harness|winogrande|5": {
"acc": 0.728492501973165,
"acc_stderr": 0.012499326254893129
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Cubpaw/test_path_gen_voxelgym_3c_100_42x42 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: rgb_label
dtype: image
- name: path_label
dtype: image
- name: path_rgb_label
dtype: image
splits:
- name: train
num_bytes: 71974.0
num_examples: 80
- name: validation
num_bytes: 18401.0
num_examples: 20
download_size: 88974
dataset_size: 90375.0
---
# Dataset Card for "test_path_gen_voxelgym_3c_100_42x42"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rjhuang/my_wb_preferences | ---
license: apache-2.0
task_categories:
- text-classification
language:
- zh
tags:
- social
dataset_info:
features:
- name: author
dtype: string
- name: content
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 156400
num_examples: 417
download_size: 100117
dataset_size: 156400
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
vwxyzjn/cai-conversation-dev1-h4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: init_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: init_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 35674961.23498535
num_examples: 13107
- name: test
num_bytes: 8919420.765014648
num_examples: 3277
download_size: 20673986
dataset_size: 44594382.0
---
# Dataset Card for "cai-conversation-dev1-h4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
darsanaraja/raga2 | ---
license: cc-by-sa-4.0
task_categories:
- question-answering
language:
- en
size_categories:
- 10K<n<100K
--- |
open-llm-leaderboard/details_Steelskull__Aurora_base_test | ---
pretty_name: Evaluation run of Steelskull/Aurora_base_test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Steelskull/Aurora_base_test](https://huggingface.co/Steelskull/Aurora_base_test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Steelskull__Aurora_base_test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T14:10:28.609924](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Aurora_base_test/blob/main/results_2024-01-21T14-10-28.609924.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.601064189469788,\n\
\ \"acc_stderr\": 0.033290989395204654,\n \"acc_norm\": 0.606869148379456,\n\
\ \"acc_norm_stderr\": 0.03397539939220347,\n \"mc1\": 0.5312117503059975,\n\
\ \"mc1_stderr\": 0.01746936487457753,\n \"mc2\": 0.6783801405582048,\n\
\ \"mc2_stderr\": 0.015256900573808395\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142825\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6520613423620792,\n\
\ \"acc_stderr\": 0.00475342980664544,\n \"acc_norm\": 0.8398725353515236,\n\
\ \"acc_norm_stderr\": 0.0036597474762410597\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300645,\n \"\
acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300645\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968351,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968351\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082393,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082393\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335839,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335839\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\
\ \"acc_stderr\": 0.015652542496421125,\n \"acc_norm\": 0.3240223463687151,\n\
\ \"acc_norm_stderr\": 0.015652542496421125\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.02685729466328141,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.02685729466328141\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489906,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489906\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085637,\n \
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5312117503059975,\n\
\ \"mc1_stderr\": 0.01746936487457753,\n \"mc2\": 0.6783801405582048,\n\
\ \"mc2_stderr\": 0.015256900573808395\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3252463987869598,\n \
\ \"acc_stderr\": 0.012903904752543927\n }\n}\n```"
repo_url: https://huggingface.co/Steelskull/Aurora_base_test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|arc:challenge|25_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|gsm8k|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hellaswag|10_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T14-10-28.609924.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T14-10-28.609924.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- '**/details_harness|winogrande|5_2024-01-21T14-10-28.609924.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T14-10-28.609924.parquet'
- config_name: results
data_files:
- split: 2024_01_21T14_10_28.609924
path:
- results_2024-01-21T14-10-28.609924.parquet
- split: latest
path:
- results_2024-01-21T14-10-28.609924.parquet
---
# Dataset Card for Evaluation run of Steelskull/Aurora_base_test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Steelskull/Aurora_base_test](https://huggingface.co/Steelskull/Aurora_base_test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Steelskull__Aurora_base_test",
"harness_winogrande_5",
split="train")
```
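Beyond the "train" pointer above, each per-task configuration also exposes the timestamped split directly, and the aggregated metrics live in the "results" configuration. A short sketch, using only configuration and split names declared in the YAML header of this card:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Steelskull__Aurora_base_test"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs))

# Load one run of a task via its timestamped split; the "latest" split points
# to the same files while only a single run exists.
gsm8k = load_dataset(repo, "harness_gsm8k_5", split="2024_01_21T14_10_28.609924")

# Load the aggregated metrics shown on the leaderboard.
results = load_dataset(repo, "results", split="latest")
```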
## Latest results
These are the [latest results from run 2024-01-21T14:10:28.609924](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Aurora_base_test/blob/main/results_2024-01-21T14-10-28.609924.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.601064189469788,
"acc_stderr": 0.033290989395204654,
"acc_norm": 0.606869148379456,
"acc_norm_stderr": 0.03397539939220347,
"mc1": 0.5312117503059975,
"mc1_stderr": 0.01746936487457753,
"mc2": 0.6783801405582048,
"mc2_stderr": 0.015256900573808395
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142825
},
"harness|hellaswag|10": {
"acc": 0.6520613423620792,
"acc_stderr": 0.00475342980664544,
"acc_norm": 0.8398725353515236,
"acc_norm_stderr": 0.0036597474762410597
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968351,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968351
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082393,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082393
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335839,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335839
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3240223463687151,
"acc_stderr": 0.015652542496421125,
"acc_norm": 0.3240223463687151,
"acc_norm_stderr": 0.015652542496421125
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.02685729466328141,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.02685729466328141
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489906,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489906
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085637,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5312117503059975,
"mc1_stderr": 0.01746936487457753,
"mc2": 0.6783801405582048,
"mc2_stderr": 0.015256900573808395
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275625
},
"harness|gsm8k|5": {
"acc": 0.3252463987869598,
"acc_stderr": 0.012903904752543927
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
316usman/thematic2dembed | ---
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 22244750
num_examples: 29359
download_size: 6099068
dataset_size: 22244750
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
janck/bigscience-lama | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
trex:
- 1M<n<10M
task_categories:
- text-retrieval
- text-classification
task_ids:
- fact-checking-retrieval
- text-scoring
paperswithcode_id: lama
pretty_name: 'LAMA: LAnguage Model Analysis - BigScience version'
tags:
- probing
---
# Dataset Card for LAMA: LAnguage Model Analysis - a dataset for probing and analyzing the factual and commonsense knowledge contained in pretrained language models.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:**
https://github.com/facebookresearch/LAMA
- **Repository:**
https://github.com/facebookresearch/LAMA
- **Paper:**
@inproceedings{petroni2019language,
title={Language Models as Knowledge Bases?},
author={F. Petroni, T. Rockt{\"{a}}schel, A. H. Miller, P. Lewis, A. Bakhtin, Y. Wu and S. Riedel},
booktitle={Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
year={2019}
}
@inproceedings{petroni2020how,
title={How Context Affects Language Models' Factual Predictions},
author={Fabio Petroni and Patrick Lewis and Aleksandra Piktus and Tim Rockt{\"a}schel and Yuxiang Wu and Alexander H. Miller and Sebastian Riedel},
booktitle={Automated Knowledge Base Construction},
year={2020},
url={https://openreview.net/forum?id=025X0zPfn}
}
### Dataset Summary
This dataset provides the data for LAMA. It contains only the TRex
subset (Wikidata triples).
The dataset includes some cleanup and the addition of a masked sentence
with associated answers for the [MASK] token. Accuracy in
predicting the [MASK] token shows how well a language model knows
facts and commonsense information. The [MASK] tokens are used only for the
"object" slots.
This version also provides questions alongside the templates, so it can be used to probe non-masking models as well.
See the paper for more details. For more information, also see:
https://github.com/facebookresearch/LAMA
### Languages
en
## Dataset Structure
### Data Instances
The trex config has the following fields (34039 instances in total):
```
{'uuid': 'a37257ae-4cbb-4309-a78a-623036c96797', 'sub_label': 'Pianos Become the Teeth', 'predicate_id': 'P740', 'obj_label': 'Baltimore', 'template': '[X] was founded in [Y] .', 'type': 'N-1', 'question': 'Where was [X] founded?'}
```
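Both probe styles can be reconstructed directly from these fields. The snippet below is a minimal sketch, assuming the repository loads with its default config and a `train` split exposing the fields shown above:
```python
from datasets import load_dataset

# Minimal sketch: load the TRex triples and rebuild one probe.
data = load_dataset("janck/bigscience-lama", split="train")
ex = data[0]

# Cloze-style probe for masked language models...
cloze = ex["template"].replace("[X]", ex["sub_label"]).replace("[Y]", "[MASK]")
# ...and question-style probe for non-masking models.
question = ex["question"].replace("[X]", ex["sub_label"])

print(cloze)            # e.g. "Pianos Become the Teeth was founded in [MASK] ."
print(question)         # e.g. "Where was Pianos Become the Teeth founded?"
print(ex["obj_label"])  # gold answer, e.g. "Baltimore"
```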
### Data Splits
There are no data splits.
## Dataset Creation
### Curation Rationale
This dataset was gathered and created to probe what language models understand.
### Source Data
#### Initial Data Collection and Normalization
See the research paper and website for more detail. The dataset was
gathered from various other datasets, with cleanup applied for probing.
#### Who are the source language producers?
The LAMA authors and the original authors of the various configs.
### Annotations
#### Annotation process
Human annotations from the original datasets (ConceptNet), plus various machine annotations.
#### Who are the annotators?
Human annotations and machine annotations.
### Personal and Sensitive Information
Unknown, but likely includes names of famous people.
## Considerations for Using the Data
### Social Impact of Dataset
The goal for the work is to probe the understanding of language models.
### Discussion of Biases
Since the data comes from human annotators, it is likely to contain biases.
[More Information Needed]
### Other Known Limitations
The original documentation for the data fields is limited.
## Additional Information
### Dataset Curators
The authors of LAMA at Facebook and the authors of the original datasets.
### Licensing Information
The Creative Commons Attribution-NonCommercial 4.0 International License; see https://github.com/facebookresearch/LAMA/blob/master/LICENSE
### Citation Information
@inproceedings{petroni2019language,
title={Language Models as Knowledge Bases?},
author={F. Petroni, T. Rockt{\"{a}}schel, A. H. Miller, P. Lewis, A. Bakhtin, Y. Wu and S. Riedel},
booktitle={Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
year={2019}
}
@inproceedings{petroni2020how,
title={How Context Affects Language Models' Factual Predictions},
author={Fabio Petroni and Patrick Lewis and Aleksandra Piktus and Tim Rockt{\"a}schel and Yuxiang Wu and Alexander H. Miller and Sebastian Riedel},
booktitle={Automated Knowledge Base Construction},
year={2020},
url={https://openreview.net/forum?id=025X0zPfn}
}
|
CyberHarem/hellagur_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hellagur_arknights
This is the dataset of hellagur_arknights, containing 93 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 93 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 203 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 93 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 93 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 93 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 93 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 93 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 203 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 203 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 203 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
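The packaged archives above can also be fetched programmatically. Below is a minimal sketch using `huggingface_hub`; the filename must be one of the packages listed in the table:
```python
from huggingface_hub import hf_hub_download

# Download one packaged archive from this dataset repo (sketch; pick any
# filename from the table above, e.g. the 384x512 aligned package).
path = hf_hub_download(
    repo_id="CyberHarem/hellagur_arknights",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```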
|
Frixi/Duke_Nukem_Gianni | ---
license: openrail
---
|
codemurt/uyghur_ner_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: extra
path: data/extra-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: langs
sequence: string
- name: spans
sequence: string
splits:
- name: train
num_bytes: 538947
num_examples: 473
- name: extra
num_bytes: 11497
num_examples: 109
download_size: 140314
dataset_size: 550444
license: mit
task_categories:
- token-classification
language:
- ug
size_categories:
- n<1K
---
# Uyghur NER dataset
## Description
This dataset is in [WikiAnn](https://huggingface.co/datasets/wikiann) format. The dataset is assembled from named entities parsed from Wikipedia, Wiktionary, and DBpedia. For some words, new case forms have been created using [Apertium-uig](https://github.com/apertium/apertium-uig). Some locations have been translated using the Google Translate API.
The dataset is divided into two parts: `train` and `extra`. The `train` split has full sentences, while `extra` has only named entities.
Tags: `O (0), B-PER (1), I-PER (2), B-ORG (3), I-ORG (4), B-LOC (5), I-LOC (6)`
## Data example
```
{
'tokens': ['قاراماي', 'شەھىرى', '«مەملىكەت', 'بويىچە', 'مىللەتل…'],
'ner_tags': [5, 0, 0, 0, 0],
'langs': ['ug', 'ug', 'ug', 'ug', 'ug'],
'spans': ['LOC: قاراماي']
}
```
## Usage with `datasets` library
```py
from datasets import load_dataset
dataset = load_dataset("codemurt/uyghur_ner_dataset")
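
# Sketch: decode the integer ner_tags using the tag list above
# (index i in this list corresponds to tag id i).
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]
sample = dataset["train"][0]
print([labels[t] for t in sample["ner_tags"]])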
``` |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v1-math-6c03d1-1913164909 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v1
eval_info:
task: text_zero_shot_classification
model: facebook/opt-66b
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v1
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v1
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-66b
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v1
* Config: mathemakitten--winobias_antistereotype_test_cot_v1
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
freshpearYoon/train_free_46 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604558432
num_examples: 10000
download_size: 1154577723
dataset_size: 9604558432
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b | ---
pretty_name: Evaluation run of indischepartij/OpenMia-Indo-Engineering-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [indischepartij/OpenMia-Indo-Engineering-7b](https://huggingface.co/indischepartij/OpenMia-Indo-Engineering-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-05T07:22:15.500441](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b/blob/main/results_2024-02-05T07-22-15.500441.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6324060235996815,\n\
\ \"acc_stderr\": 0.03231686157638383,\n \"acc_norm\": 0.6331079164519123,\n\
\ \"acc_norm_stderr\": 0.032980184119772604,\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.01727001528447686,\n \"mc2\": 0.5793947082847677,\n\
\ \"mc2_stderr\": 0.01530573457723597\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.0137249784655373\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6482772356104362,\n\
\ \"acc_stderr\": 0.004765320784902126,\n \"acc_norm\": 0.8501294562836088,\n\
\ \"acc_norm_stderr\": 0.0035621498909627174\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119545,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015062,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015062\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"\
acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\"\
: 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\":\
\ {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.01727001528447686,\n \"mc2\": 0.5793947082847677,\n\
\ \"mc2_stderr\": 0.01530573457723597\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918747\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6489764973464746,\n \
\ \"acc_stderr\": 0.013146945941397222\n }\n}\n```"
repo_url: https://huggingface.co/indischepartij/OpenMia-Indo-Engineering-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|arc:challenge|25_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|gsm8k|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hellaswag|10_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T07-22-15.500441.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T07-22-15.500441.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- '**/details_harness|winogrande|5_2024-02-05T07-22-15.500441.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-05T07-22-15.500441.parquet'
- config_name: results
data_files:
- split: 2024_02_05T07_22_15.500441
path:
- results_2024-02-05T07-22-15.500441.parquet
- split: latest
path:
- results_2024-02-05T07-22-15.500441.parquet
---
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Engineering-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Engineering-7b](https://huggingface.co/indischepartij/OpenMia-Indo-Engineering-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b",
"harness_winogrande_5",
split="train")
```
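The aggregated metrics live in the `results` configuration declared in the YAML above, and its `latest` split always points at the most recent run. As a minimal sketch (same `load_dataset` call, just a different configuration and split):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# its "latest" split is an alias for the most recent timestamped split
# (here 2024_02_05T07_22_15.500441).
results = load_dataset(
    "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b",
    "results",
    split="latest",
)
```
Passing the timestamped split name instead of `"latest"` pins a specific run.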
## Latest results
These are the [latest results from run 2024-02-05T07:22:15.500441](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b/blob/main/results_2024-02-05T07-22-15.500441.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6324060235996815,
"acc_stderr": 0.03231686157638383,
"acc_norm": 0.6331079164519123,
"acc_norm_stderr": 0.032980184119772604,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447686,
"mc2": 0.5793947082847677,
"mc2_stderr": 0.01530573457723597
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.0137249784655373
},
"harness|hellaswag|10": {
"acc": 0.6482772356104362,
"acc_stderr": 0.004765320784902126,
"acc_norm": 0.8501294562836088,
"acc_norm_stderr": 0.0035621498909627174
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119545,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015062,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015062
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447686,
"mc2": 0.5793947082847677,
"mc2_stderr": 0.01530573457723597
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918747
},
"harness|gsm8k|5": {
"acc": 0.6489764973464746,
"acc_stderr": 0.013146945941397222
}
}
```
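Per-task entries above are keyed as `harness|<task>|<n_shot>`, so individual metrics come out with plain dictionary access. A minimal, self-contained sketch (the dictionary below only restates three values from the snippet above; it is not the full schema of the results file):
```python
# Excerpt of the results dictionary printed above; task entries are keyed
# "harness|<task>|<n_shot>".
latest = {
    "all": {"acc_norm": 0.6331079164519123},
    "harness|arc:challenge|25": {"acc_norm": 0.6715017064846417},
    "harness|truthfulqa:mc|0": {"mc2": 0.5793947082847677},
}

print(latest["all"]["acc_norm"])                       # aggregated normalized accuracy
print(latest["harness|arc:challenge|25"]["acc_norm"])  # ARC-Challenge, 25-shot
print(latest["harness|truthfulqa:mc|0"]["mc2"])        # TruthfulQA MC2, 0-shot
```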
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
wal14567/test_xray_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 979103611.0
num_examples: 400
download_size: 979155341
dataset_size: 979103611.0
---
# Dataset Card for "test_xray_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/screamimg_woman_portrait_prompts_sdxl | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 815950498
num_examples: 2000000
download_size: 131182947
dataset_size: 815950498
---
# Dataset Card for "screamimg_woman_portrait_prompts_sdxl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Simaosantos/cremosos | ---
license: openrail
---
|
zhengzhongliang/pubmed_causal_relation_pretrain_pure_text | ---
license: cc-by-nc-sa-4.0
---
|
joey234/mmlu-human_sexuality-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 32615
num_examples: 131
download_size: 22825
dataset_size: 32615
---
# Dataset Card for "mmlu-human_sexuality-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
celsowm/adoro_cinema_filmes | ---
dataset_info:
features:
- name: titulo
dtype: string
- name: sinopse
dtype: string
- name: generos
sequence: string
- name: link
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 23369140
num_examples: 42918
download_size: 13807632
dataset_size: 23369140
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "adoro_cinema_filmes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ServiceNow/hotpot_test_pos__4_5 | ---
dataset_info:
features:
- name: context
dtype: string
- name: contexts_list
sequence: string
- name: titles_list
sequence: string
- name: useful_contexts
sequence: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: sample_idx
dtype: int64
- name: dataset
dtype: string
splits:
- name: test
num_bytes: 255384789
num_examples: 22215
download_size: 150788299
dataset_size: 255384789
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
edcarlossilva/vozdoedcarlos | ---
license: openrail
---
|
navinaananthan/Dhivehi-English-ParallelCorpus | ---
language:
- dv
--- |
brayanfs/CelsoRussomano | ---
license: openrail
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_194 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1008210908.0
num_examples: 197999
download_size: 1020408217
dataset_size: 1008210908.0
---
# Dataset Card for "chunk_194"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712979932 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 4091
num_examples: 9
download_size: 7757
dataset_size: 4091
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712979932"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nayohan/022_summary_report_3sent | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
- name: summary_extend
dtype: string
splits:
- name: train
num_bytes: 220466616
num_examples: 82490
download_size: 126634967
dataset_size: 220466616
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FreedomIntelligence/2023_Pharmacist_Licensure_Examination-TCM_track | ---
license: apache-2.0
---
The 2023 Chinese National Pharmacist Licensure Examination is divided into two distinct tracks: the Pharmacy track and the Traditional Chinese Medicine (TCM) Pharmacy track. The data provided here pertains to the Traditional Chinese Medicine (TCM) Pharmacy track examination. Note that this dataset was collected from online sources, so there may be some discrepancies between this data and the actual examination.
- **Repository:** https://github.com/FreedomIntelligence/HuatuoGPT-II |
sanaeai/ar_sarcasm_v21 | ---
dataset_info:
features:
- name: tweet
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 2499149
num_examples: 15548
download_size: 1424196
dataset_size: 2499149
---
# Dataset Card for "ar_sarcasm_v21"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arianhosseini/trl_data | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 36162393
num_examples: 20000
download_size: 20623731
dataset_size: 36162393
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
livinNector/ner_naamapadam | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 186464659
num_examples: 497882
- name: test
num_bytes: 319747
num_examples: 758
- name: validation
num_bytes: 1046343
num_examples: 2795
download_size: 48623255
dataset_size: 187830749
---
# Dataset Card for "ner_naamapadam"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-7b-hf-gpt-3.5-80k | ---
pretty_name: Evaluation run of XuanXuanXuanXuan/Llama-2-7b-hf-gpt-3.5-80k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [XuanXuanXuanXuan/Llama-2-7b-hf-gpt-3.5-80k](https://huggingface.co/XuanXuanXuanXuan/Llama-2-7b-hf-gpt-3.5-80k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-7b-hf-gpt-3.5-80k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T17:06:23.897060](https://huggingface.co/datasets/open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-7b-hf-gpt-3.5-80k/blob/main/results_2024-03-21T17-06-23.897060.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4614904399129846,\n\
\ \"acc_stderr\": 0.03449043206550789,\n \"acc_norm\": 0.4663446500271822,\n\
\ \"acc_norm_stderr\": 0.035268921719628615,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4142436131049865,\n\
\ \"mc2_stderr\": 0.01485881406178638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5008532423208191,\n \"acc_stderr\": 0.014611369529813283,\n\
\ \"acc_norm\": 0.53839590443686,\n \"acc_norm_stderr\": 0.014568245550296358\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.563931487751444,\n\
\ \"acc_stderr\": 0.004948824501355489,\n \"acc_norm\": 0.7575184226249752,\n\
\ \"acc_norm_stderr\": 0.0042770811502584665\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.032436186361081004,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.032436186361081004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4967741935483871,\n\
\ \"acc_stderr\": 0.02844341422643833,\n \"acc_norm\": 0.4967741935483871,\n\
\ \"acc_norm_stderr\": 0.02844341422643833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.03308530426228258,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.03308530426228258\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.03815494308688932,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03815494308688932\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5505050505050505,\n \"acc_stderr\": 0.035441324919479704,\n \"\
acc_norm\": 0.5505050505050505,\n \"acc_norm_stderr\": 0.035441324919479704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n\
\ \"acc_norm\": 0.6321243523316062,\n \"acc_norm_stderr\": 0.034801756684660366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658751,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658751\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6403669724770642,\n \"acc_stderr\": 0.020575234660123776,\n \"\
acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.020575234660123776\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329882,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329882\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5049019607843137,\n \"acc_stderr\": 0.03509143375606785,\n \"\
acc_norm\": 0.5049019607843137,\n \"acc_norm_stderr\": 0.03509143375606785\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811225,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811225\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.0493929144727348,\n\
\ \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.0493929144727348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.02987257770889118,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.02987257770889118\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6232439335887612,\n\
\ \"acc_stderr\": 0.01732829290730305,\n \"acc_norm\": 0.6232439335887612,\n\
\ \"acc_norm_stderr\": 0.01732829290730305\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.0269150473553698,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.0269150473553698\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n \
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.02773125864701199,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.02773125864701199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49382716049382713,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.49382716049382713,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3520208604954368,\n\
\ \"acc_stderr\": 0.012198140605353605,\n \"acc_norm\": 0.3520208604954368,\n\
\ \"acc_norm_stderr\": 0.012198140605353605\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.033455630703391914,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.033455630703391914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4142436131049865,\n\
\ \"mc2_stderr\": 0.01485881406178638\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7213891081294396,\n \"acc_stderr\": 0.012599896649493882\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1463229719484458,\n \
\ \"acc_stderr\": 0.009735210557785264\n }\n}\n```"
repo_url: https://huggingface.co/XuanXuanXuanXuan/Llama-2-7b-hf-gpt-3.5-80k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|arc:challenge|25_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|gsm8k|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hellaswag|10_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-06-23.897060.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T17-06-23.897060.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- '**/details_harness|winogrande|5_2024-03-21T17-06-23.897060.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T17-06-23.897060.parquet'
- config_name: results
data_files:
- split: 2024_03_21T17_06_23.897060
path:
- results_2024-03-21T17-06-23.897060.parquet
- split: latest
path:
- results_2024-03-21T17-06-23.897060.parquet
---
# Dataset Card for Evaluation run of XuanXuanXuanXuan/Llama-2-7b-hf-gpt-3.5-80k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [XuanXuanXuanXuan/Llama-2-7b-hf-gpt-3.5-80k](https://huggingface.co/XuanXuanXuanXuan/Llama-2-7b-hf-gpt-3.5-80k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-7b-hf-gpt-3.5-80k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T17:06:23.897060](https://huggingface.co/datasets/open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-7b-hf-gpt-3.5-80k/blob/main/results_2024-03-21T17-06-23.897060.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4614904399129846,
"acc_stderr": 0.03449043206550789,
"acc_norm": 0.4663446500271822,
"acc_norm_stderr": 0.035268921719628615,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4142436131049865,
"mc2_stderr": 0.01485881406178638
},
"harness|arc:challenge|25": {
"acc": 0.5008532423208191,
"acc_stderr": 0.014611369529813283,
"acc_norm": 0.53839590443686,
"acc_norm_stderr": 0.014568245550296358
},
"harness|hellaswag|10": {
"acc": 0.563931487751444,
"acc_stderr": 0.004948824501355489,
"acc_norm": 0.7575184226249752,
"acc_norm_stderr": 0.0042770811502584665
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4967741935483871,
"acc_stderr": 0.02844341422643833,
"acc_norm": 0.4967741935483871,
"acc_norm_stderr": 0.02844341422643833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.03308530426228258,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.03308530426228258
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03815494308688932,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03815494308688932
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5505050505050505,
"acc_stderr": 0.035441324919479704,
"acc_norm": 0.5505050505050505,
"acc_norm_stderr": 0.035441324919479704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6321243523316062,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.6321243523316062,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658751,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658751
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.020575234660123776,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.020575234660123776
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329882,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329882
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5049019607843137,
"acc_stderr": 0.03509143375606785,
"acc_norm": 0.5049019607843137,
"acc_norm_stderr": 0.03509143375606785
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811225,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811225
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5339805825242718,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.5339805825242718,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.02987257770889118,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.02987257770889118
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6232439335887612,
"acc_stderr": 0.01732829290730305,
"acc_norm": 0.6232439335887612,
"acc_norm_stderr": 0.01732829290730305
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.0269150473553698,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.0269150473553698
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.02773125864701199,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.02773125864701199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49382716049382713,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.49382716049382713,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3520208604954368,
"acc_stderr": 0.012198140605353605,
"acc_norm": 0.3520208604954368,
"acc_norm_stderr": 0.012198140605353605
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.033455630703391914,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.033455630703391914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4142436131049865,
"mc2_stderr": 0.01485881406178638
},
"harness|winogrande|5": {
"acc": 0.7213891081294396,
"acc_stderr": 0.012599896649493882
},
"harness|gsm8k|5": {
"acc": 0.1463229719484458,
"acc_stderr": 0.009735210557785264
}
}
```
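To retrieve the aggregated metrics shown above programmatically, the `results` configuration can be loaded as well; this is a minimal sketch (per the configuration list above, the `latest` split points to the most recent results file):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the newest parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_XuanXuanXuanXuan__Llama-2-7b-hf-gpt-3.5-80k",
    "results",
    split="latest",
)
```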
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Rodr16020/Bactrian-Spanish-Clean-Light | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: id
dtype: string
- name: output
dtype: string
- name: instruction_text
dtype: string
splits:
- name: train
num_bytes: 5191106
num_examples: 3000
download_size: 2646581
dataset_size: 5191106
---
# Dataset Card for "Bactrian-Spanish-Clean-Light"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
berquetR/dlab_project_optimal_links | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: current_page
dtype: string
- name: current_page_links
dtype: string
- name: next_page
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 128942469
num_examples: 78088
- name: validation
num_bytes: 43511265
num_examples: 26178
- name: test
num_bytes: 43295032
num_examples: 26193
download_size: 32314400
dataset_size: 215748766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
kalyan003/full_text | ---
license: unknown
---
|
lh0109/PyGeneration | ---
license: mit
---
|
guetLzy/genshin | ---
license: cc
---
|
nandobiassusijr/gilson001 | ---
license: openrail
---
|
rajat-relove/test | ---
license: mit
task_categories:
- text-to-image
language:
- hi
tags:
- fashion
size_categories:
- 1K<n<10K
--- |
sdadas/gpt-exams | ---
language:
- pl
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
task_categories:
- question-answering
pretty_name: GPT-exams
dataset_info:
features:
- name: _id
dtype: int32
- name: question
dtype: string
- name: answer
dtype: string
- name: domain
dtype: string
splits:
- name: train
num_bytes: 17237681
num_examples: 8131
---
# GPT-exams
### Dataset summary
The dataset contains 8131 multi-domain question-answer pairs. It was created semi-automatically using the `gpt-3.5-turbo-0613` model available in the OpenAI API. The process of building the dataset was as follows (a sketch of the generation loop appears after this list):
1. We manually prepared a list of 409 university-level courses from various fields. For each course, we instructed the model with the prompt: "Wygeneruj 20 przykładowych pytań na egzamin z [nazwa przedmiotu]" (Generate 20 sample questions for the [course name] exam).
2. We then parsed the model's outputs to extract individual questions and deduplicated them.
3. In the next step, we requested the model to generate an answer to each of the collected questions. We used the following prompt: "Odpowiedz na następujące pytanie z dziedziny [nazwa przedmiotu]: [treść pytania]" (Answer the following question from [course name]: [question content]). Along with the prompt, we also sent the following system message: "Jesteś ekspertem w dziedzinie [nazwa przedmiotu]. Udzielasz specjalistycznych i wyczerpujących odpowiedzi na pytania." (You are an expert in [course name]. You provide knowledgeable and comprehensive answers to questions.)
4. In the last step, we manually removed from the dataset the cases in which the model refused to answer the question. We searched for occurrences of phrases such as "model języka" (language model), "nie jestem" (I'm not), or "nie mogę" (I can't).
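The exact generation scripts are not published; the following is a minimal, hypothetical sketch of steps 1-3 using the `openai` Python client. Function names are illustrative, and the batching, retries, and manual filtering described above are omitted.
```python
# Hypothetical sketch of the generation loop (steps 1-3); function names are
# illustrative and not the authors' actual code. Assumes OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-3.5-turbo-0613"

def generate_questions(course: str) -> list[str]:
    """Step 1: ask the model for 20 sample exam questions for a course."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user",
                   "content": f"Wygeneruj 20 przykładowych pytań na egzamin z {course}"}],
    )
    # Step 2: split the numbered list into individual questions (deduplication omitted).
    lines = resp.choices[0].message.content.splitlines()
    return [ln.lstrip("0123456789.) ").strip() for ln in lines if ln.strip()]

def answer_question(course: str, question: str) -> str:
    """Step 3: ask for an answer, using the expert system message from the card."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system",
             "content": (f"Jesteś ekspertem w dziedzinie {course}. "
                         "Udzielasz specjalistycznych i wyczerpujących odpowiedzi na pytania.")},
            {"role": "user",
             "content": f"Odpowiedz na następujące pytanie z dziedziny {course}: {question}"},
        ],
    )
    return resp.choices[0].message.content
```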
### Data Instances
Example instance:
```
{
"_id": 2338,
"domain": "wzorców projektowych w oprogramowaniu",
"question": "Co to jest dependency injection i jak może być wykorzystane w kontekście wzorców projektowych?",
"answer": "Dependency injection (DI) to technika wstrzykiwania zależności, która polega na dostarczaniu obiektowi (...)"
}
```
### Data Fields
- _id: record id
- question: question text
- answer: answer text
- domain: name of the course / field / domain
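### Usage
A minimal loading sketch with the `datasets` library (the `train` split is the only published split):
```python
from datasets import load_dataset

# Load all 8131 question-answer pairs.
ds = load_dataset("sdadas/gpt-exams", split="train")
print(ds[0]["domain"], ds[0]["question"])
```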
|
CaoHaiNam/12-01-2024-last-2000-row-QA | ---
dataset_info:
features:
- name: text
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 1129802
num_examples: 633
download_size: 535918
dataset_size: 1129802
---
# Dataset Card for "12-01-2024-last-2000-row-QA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bogdansinik/embeddings | ---
license: mit
---
|