| datasetId (string, 2–117 chars) | card (string, 19–1.01M chars) |
|---|---|
Ejafa/GPT_4_with_ShareGPT | ---
license: other
---
|
isoleucin/fin-certificates | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 2147258.397905759
num_examples: 121
- name: validation
num_bytes: 550124.0523560209
num_examples: 31
- name: test
num_bytes: 692091.5497382199
num_examples: 39
download_size: 746938
dataset_size: 3389474.0
---
# Dataset Card for "fin-certificates"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1 | ---
pretty_name: Evaluation run of Felladrin/Smol-Llama-101M-Chat-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Felladrin/Smol-Llama-101M-Chat-v1](https://huggingface.co/Felladrin/Smol-Llama-101M-Chat-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T12:30:02.788915](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1/blob/main/results_2024-03-03T12-30-02.788915.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24876164017048136,\n\
\ \"acc_stderr\": 0.03047984253451912,\n \"acc_norm\": 0.2496112873187987,\n\
\ \"acc_norm_stderr\": 0.03127796053175917,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.45756971268130436,\n\
\ \"mc2_stderr\": 0.015178620872901784\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18686006825938567,\n \"acc_stderr\": 0.011391015649694379,\n\
\ \"acc_norm\": 0.22866894197952217,\n \"acc_norm_stderr\": 0.012272853582540795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2751443935471022,\n\
\ \"acc_stderr\": 0.004456743108170734,\n \"acc_norm\": 0.2870942043417646,\n\
\ \"acc_norm_stderr\": 0.004514813363221152\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.3037037037037037,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108614,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108614\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080343,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080343\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.02185150982203172,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.02185150982203172\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.33225806451612905,\n \"acc_stderr\": 0.02679556084812279,\n \"\
acc_norm\": 0.33225806451612905,\n \"acc_norm_stderr\": 0.02679556084812279\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"\
acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.032396370467357015,\n\
\ \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.032396370467357015\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.022211106810061672,\n \
\ \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.022211106810061672\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790222,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790222\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303529,\n\
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303529\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n\
\ \"acc_stderr\": 0.02758406660220826,\n \"acc_norm\": 0.21524663677130046,\n\
\ \"acc_norm_stderr\": 0.02758406660220826\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.026453508054040325,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.026453508054040325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n\
\ \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.02417084087934102,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.02417084087934102\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880585,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880585\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279338,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279338\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.28308823529411764,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.28308823529411764,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528047,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528047\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.17272727272727273,\n\
\ \"acc_stderr\": 0.036206918339292196,\n \"acc_norm\": 0.17272727272727273,\n\
\ \"acc_norm_stderr\": 0.036206918339292196\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.1890547263681592,\n\
\ \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.1890547263681592,\n\
\ \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.0317555478662992,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.0317555478662992\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.45756971268130436,\n\
\ \"mc2_stderr\": 0.015178620872901784\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.500394632991318,\n \"acc_stderr\": 0.014052481306049512\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225172\n }\n}\n```"
repo_url: https://huggingface.co/Felladrin/Smol-Llama-101M-Chat-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|arc:challenge|25_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|arc:challenge|25_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|gsm8k|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|gsm8k|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hellaswag|10_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hellaswag|10_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T19-30-13.746850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T12-30-02.788915.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T12-30-02.788915.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- '**/details_harness|winogrande|5_2023-12-29T19-30-13.746850.parquet'
- split: 2024_03_03T12_30_02.788915
path:
- '**/details_harness|winogrande|5_2024-03-03T12-30-02.788915.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T12-30-02.788915.parquet'
- config_name: results
data_files:
- split: 2023_12_29T19_30_13.746850
path:
- results_2023-12-29T19-30-13.746850.parquet
- split: 2024_03_03T12_30_02.788915
path:
- results_2024-03-03T12-30-02.788915.parquet
- split: latest
path:
- results_2024-03-03T12-30-02.788915.parquet
---
# Dataset Card for Evaluation run of Felladrin/Smol-Llama-101M-Chat-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Felladrin/Smol-Llama-101M-Chat-v1](https://huggingface.co/Felladrin/Smol-Llama-101M-Chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-03T12:30:02.788915](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1/blob/main/results_2024-03-03T12-30-02.788915.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.24876164017048136,
"acc_stderr": 0.03047984253451912,
"acc_norm": 0.2496112873187987,
"acc_norm_stderr": 0.03127796053175917,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.45756971268130436,
"mc2_stderr": 0.015178620872901784
},
"harness|arc:challenge|25": {
"acc": 0.18686006825938567,
"acc_stderr": 0.011391015649694379,
"acc_norm": 0.22866894197952217,
"acc_norm_stderr": 0.012272853582540795
},
"harness|hellaswag|10": {
"acc": 0.2751443935471022,
"acc_stderr": 0.004456743108170734,
"acc_norm": 0.2870942043417646,
"acc_norm_stderr": 0.004514813363221152
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108614,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108614
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080343,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080343
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.02185150982203172,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.02185150982203172
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.33225806451612905,
"acc_stderr": 0.02679556084812279,
"acc_norm": 0.33225806451612905,
"acc_norm_stderr": 0.02679556084812279
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.032396370467357015,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.032396370467357015
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.022211106810061672,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.022211106810061672
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790222,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790222
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303529,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.02758406660220826,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.02758406660220826
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.026453508054040325,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.026453508054040325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.02417084087934102,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.02417084087934102
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880585,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880585
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279338,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.28308823529411764,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.28308823529411764,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528047,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528047
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.17272727272727273,
"acc_stderr": 0.036206918339292196,
"acc_norm": 0.17272727272727273,
"acc_norm_stderr": 0.036206918339292196
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.1890547263681592,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.1890547263681592,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.0317555478662992,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.0317555478662992
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.45756971268130436,
"mc2_stderr": 0.015178620872901784
},
"harness|winogrande|5": {
"acc": 0.500394632991318,
"acc_stderr": 0.014052481306049512
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225172
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
eroberto123/pablo | ---
license: unknown
---
|
jan-hq/athirdpath_roleplay_dpo_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 10769968.084889147
num_examples: 3085
- name: test
num_bytes: 1197438.9151108519
num_examples: 343
download_size: 4623261
dataset_size: 11967407.0
---
# Dataset Card for "athirdpath_roleplay_dpo_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mamiksik/annotated-diff-metadata | ---
dataset_info:
features:
- name: sha
dtype: string
- name: author
dtype: string
- name: committer
dtype: string
- name: message
dtype: string
- name: subject
dtype: string
- name: subject_length
dtype: float64
- name: is_chore
dtype: bool
- name: is_bot
dtype: bool
- name: subject_word_count
dtype: float64
- name: verb_object_spacy
dtype: bool
- name: verb_object_stanza
dtype: bool
- name: fits_requirements
dtype: bool
- name: owner
dtype: string
- name: repo
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 221089223
num_examples: 668743
download_size: 0
dataset_size: 221089223
---
# Dataset Card for "analysed-diff-metadata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/p90_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of p90/P90/P90 (Girls' Frontline)
This is the dataset of p90/P90/P90 (Girls' Frontline), containing 340 images and their tags.
The core tags of this character are `hair_bun, double_bun, red_eyes, short_hair, breasts, light_brown_hair, sunglasses, eyewear_on_head, bangs, medium_breasts, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 340 | 613.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p90_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 340 | 298.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p90_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 901 | 684.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p90_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 340 | 514.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p90_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 901 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/p90_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/p90_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 70 |  |  |  |  |  | p90, 1girl, solo, choker, black_gloves, holding_gun, smile, jacket, looking_at_viewer, thighhighs, snap-fit_buckle, gas_mask, open_mouth, blush, partially_fingerless_gloves, thigh_strap |
| 1 | 5 |  |  |  |  |  | 1girl, choker, gas_mask, jacket, looking_at_viewer, snap-fit_buckle, solo, black_gloves, smile, tactical_clothes, white_background, open_clothes, thighhighs, blush, fingerless_gloves, open_mouth, shirt, simple_background |
| 2 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, choker, smile, black_gloves, collarbone, jacket, blush, white_background, simple_background, partially_fingerless_gloves, open_mouth |
| 3 | 5 |  |  |  |  |  | 1girl, blush, earrings, official_alternate_costume, smile, solo, black_gloves, choker, coat, long_sleeves, closed_mouth, looking_at_viewer, black_footwear, brown_necktie, brown_shirt, full_body, open_clothes |
| 4 | 5 |  |  |  |  |  | 1girl, black_footwear, black_gloves, blush, high_heel_boots, looking_at_viewer, official_alternate_costume, red_bow, solo, thigh_boots, black_thighhighs, choker, collarbone, gas_mask, hair_bow, p90, smile, black_shorts, christmas, full_body, hairclip, open_mouth, star_hair_ornament, bell, fur_trim, long_sleeves, off_shoulder, red_coat, snap-fit_buckle, standing |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, large_breasts, nipples, open_mouth, penis, solo_focus, choker, sex, vaginal, cowgirl_position, cum_on_breasts, looking_at_viewer, official_alternate_costume, smile, bar_censor, black_gloves, cum_in_pussy, girl_on_top, hair_ornament, mosaic_censoring, navel, nude, partially_fingerless_gloves, pov, saliva, sweat, thighhighs, tongue |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | p90 | 1girl | solo | choker | black_gloves | holding_gun | smile | jacket | looking_at_viewer | thighhighs | snap-fit_buckle | gas_mask | open_mouth | blush | partially_fingerless_gloves | thigh_strap | tactical_clothes | white_background | open_clothes | fingerless_gloves | shirt | simple_background | collarbone | earrings | official_alternate_costume | coat | long_sleeves | closed_mouth | black_footwear | brown_necktie | brown_shirt | full_body | high_heel_boots | red_bow | thigh_boots | black_thighhighs | hair_bow | black_shorts | christmas | hairclip | star_hair_ornament | bell | fur_trim | off_shoulder | red_coat | standing | 1boy | hetero | large_breasts | nipples | penis | solo_focus | sex | vaginal | cowgirl_position | cum_on_breasts | bar_censor | cum_in_pussy | girl_on_top | hair_ornament | mosaic_censoring | navel | nude | pov | saliva | sweat | tongue |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------|:--------|:-------|:---------|:---------------|:--------------|:--------|:---------|:--------------------|:-------------|:------------------|:-----------|:-------------|:--------|:------------------------------|:--------------|:-------------------|:-------------------|:---------------|:--------------------|:--------|:--------------------|:-------------|:-----------|:-----------------------------|:-------|:---------------|:---------------|:-----------------|:----------------|:--------------|:------------|:------------------|:----------|:--------------|:-------------------|:-----------|:---------------|:------------|:-----------|:---------------------|:-------|:-----------|:---------------|:-----------|:-----------|:-------|:---------|:----------------|:----------|:--------|:-------------|:------|:----------|:-------------------|:-----------------|:-------------|:---------------|:--------------|:----------------|:-------------------|:--------|:-------|:------|:---------|:--------|:---------|
| 0 | 70 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | | X | X | X | X | | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 19 |  |  |  |  |  | | X | X | X | X | | X | X | X | | | | X | X | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | X | X | X | X | | X | | X | | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | X | X | X | | | | | | | | | X | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | | X | | X | X | | X | | X | X | | | X | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
bluuebunny/arxiv_metadata_by_year | ---
dataset_info:
features:
- name: id
dtype: string
- name: submitter
dtype: string
- name: authors
dtype: string
- name: title
dtype: string
- name: comments
dtype: string
- name: journal-ref
dtype: string
- name: doi
dtype: string
- name: report-no
dtype: string
- name: categories
dtype: string
- name: license
dtype: string
- name: abstract
dtype: string
- name: versions
dtype: binary
- name: update_date
dtype: string
- name: authors_parsed
dtype: binary
splits:
- name: train
num_bytes: 353679994
num_examples: 208492
download_size: 231613351
dataset_size: 353679994
configs:
- config_name: default
data_files:
- split: train
path: data/*.parquet
license: apache-2.0
language:
- en
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
vstenby/mat1-text-scraped | ---
license: mit
---
|
Lakera/autotrain-data-cancer-lakera | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: cancer-lakera
## Dataset Description
This dataset has been automatically processed by AutoTrain for project cancer-lakera.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<600x450 RGB PIL image>",
"feat_image_id": "ISIC_0024329",
"feat_lesion_id": "HAM_0002954",
"target": 0,
"feat_dx_type": "histo",
"feat_age": 75.0,
"feat_sex": "female",
"feat_localization": "lower extremity"
},
{
"image": "<600x450 RGB PIL image>",
"feat_image_id": "ISIC_0024372",
"feat_lesion_id": "HAM_0005389",
"target": 0,
"feat_dx_type": "histo",
"feat_age": 70.0,
"feat_sex": "male",
"feat_localization": "lower extremity"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"feat_image_id": "Value(dtype='string', id=None)",
"feat_lesion_id": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['actinic_keratoses', 'basal_cell_carcinoma', 'benign_keratosis-like_lesions'], id=None)",
"feat_dx_type": "Value(dtype='string', id=None)",
"feat_age": "Value(dtype='float64', id=None)",
"feat_sex": "Value(dtype='string', id=None)",
"feat_localization": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1200 |
| valid | 150 |
|
miragepa/A18 | ---
license: openrail
---
|
zelros/pj-maif | ---
tags:
- insurance
---
This dataset contains question/answer pairs from a French legal protection insurance contract (https://www.service-public.fr/particuliers/vosdroits/F3049?lang=en).
The objective of this dataset is to contribute to open source research projects aiming to, for instance:
* fine-tune LLMs on high-quality datasets, specializing them in the insurance domain
* develop new question/answer applications using Retrieval Augmented Generation (RAG) for insurance contracts
* assess the knowledge of language models in the insurance field
* more generally, apply LLMs to the insurance domain for better understanding and increased transparency of this industry.
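The RAG use case above can be sketched with a toy retriever. The question/answer pairs and field names below are invented placeholders, since this card does not document the dataset's actual schema:

```python
# Toy sketch of RAG-style retrieval over question/answer pairs: return
# the pair whose question shares the most words with a user query.
# The pairs below are invented placeholders, not entries from this
# dataset, and a real system would use embeddings rather than word
# overlap.

qa_pairs = [
    {"question": "What does legal protection insurance cover?",
     "answer": "It covers legal costs in covered disputes."},
    {"question": "How do I file a claim?",
     "answer": "Contact your insurer with the dispute details."},
]

def retrieve(query, pairs):
    """Pick the pair with the largest word overlap with the query."""
    q_words = set(query.lower().split())
    return max(pairs,
               key=lambda p: len(q_words & set(p["question"].lower().split())))

best = retrieve("what is covered by legal protection insurance", qa_pairs)
print(best["answer"])
```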
Other datasets of the same kind are also available - or will be available soon - and are part of this research effort. See here: https://huggingface.co/collections/zelros/legal-protection-insurance-6536e8f389dd48faca78447e
Here is an example usage of this dataset: https://huggingface.co/spaces/zelros/The-legal-protection-insurance-comparator |
ShenaoZ/ultrafeedback_subset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
splits:
- name: test_prefs
num_bytes: 13161585
num_examples: 2000
- name: train_prefs
num_bytes: 131674010
num_examples: 20000
download_size: 80662994
dataset_size: 144835595
configs:
- config_name: default
data_files:
- split: test_prefs
path: data/test_prefs-*
- split: train_prefs
path: data/train_prefs-*
---
|
japneets/Alpaca_instruction_fine_tune_Punjabi | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 46649317
num_examples: 52002
download_size: 18652304
dataset_size: 46649317
---
# Dataset Card for "Alpaca_instruction_fine_tune_Punjabi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datablations/oscar-dedup-expanded | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: warc_headers
struct:
- name: warc-record-id
dtype: string
- name: warc-date
dtype: string
- name: content-type
dtype: string
- name: content-length
dtype: int32
- name: warc-type
dtype: string
- name: warc-identified-content-language
dtype: string
- name: warc-refers-to
dtype: string
- name: warc-target-uri
dtype: string
- name: warc-block-digest
dtype: string
- name: identification
struct:
- name: label
dtype: string
- name: prob
dtype: float32
- name: annotations
sequence: string
- name: line_identifications
list:
- name: label
dtype: string
- name: prob
dtype: float32
- name: perplexity_score
dtype: float64
- name: text_length
dtype: int64
- name: url
dtype: string
- name: domain
dtype: string
- name: dup_ratio
dtype: float64
- name: pairs
sequence:
sequence: int64
- name: repetitions
sequence: binary
- name: included_in_dedup
dtype: bool
- name: cluster
sequence: int64
- name: has_dup_25
dtype: bool
splits:
- name: train
num_bytes: 3188540880787
num_examples: 431992659
download_size: 1732364041898
dataset_size: 3188540880787
---
We use the 25% suffix array to deduplicate the full OSCAR, i.e. remove any document that has a span of at least 100 characters overlapping with the 25% chunk we selected in the previous step. This is more permissive and leaves us with 136 million documents, or 31% of the original dataset. Also, for reasons whose explanation would probably involve terms like power laws, we still remove most of the most pervasive duplicates - so I'm pretty optimistic about this being useful.
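The overlap criterion above can be sketched in miniature. This is an illustrative span-overlap check, not the actual suffix-array pipeline used here (which is far more memory-efficient); function names are hypothetical. It flags a document as a duplicate if any 100-character span of it also occurs somewhere in the reference chunk.

```python
SPAN = 100  # minimum overlap length used by the dedup criterion


def shingles(text: str, span: int = SPAN):
    """Yield every contiguous `span`-character window of `text`."""
    for i in range(len(text) - span + 1):
        yield text[i:i + span]


def build_index(reference_docs):
    """Hash every 100-char span of the reference (25%) chunk into a set."""
    index = set()
    for doc in reference_docs:
        for sh in shingles(doc):
            index.add(hash(sh))
    return index


def has_dup(doc: str, index) -> bool:
    """True if `doc` shares at least one 100-char span with the reference."""
    return any(hash(sh) in index for sh in shingles(doc))
```

A real implementation would use a suffix array over the concatenated corpus instead of hashing every window, but the acceptance criterion (any shared 100-character span) is the same.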
|
open-llm-leaderboard/details_Joseph717171__Mistral-10.7B-v0.2 | ---
pretty_name: Evaluation run of Joseph717171/Mistral-10.7B-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Joseph717171/Mistral-10.7B-v0.2](https://huggingface.co/Joseph717171/Mistral-10.7B-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Joseph717171__Mistral-10.7B-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T15:53:37.777650](https://huggingface.co/datasets/open-llm-leaderboard/details_Joseph717171__Mistral-10.7B-v0.2/blob/main/results_2024-03-30T15-53-37.777650.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6290253706195131,\n\
\ \"acc_stderr\": 0.03256728907548255,\n \"acc_norm\": 0.6364881368686879,\n\
\ \"acc_norm_stderr\": 0.033245641725540745,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520684,\n \"mc2\": 0.40394087413724206,\n\
\ \"mc2_stderr\": 0.01428909136251767\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526843,\n\
\ \"acc_norm\": 0.5827645051194539,\n \"acc_norm_stderr\": 0.014409825518403084\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6096395140410277,\n\
\ \"acc_stderr\": 0.004868341056566223,\n \"acc_norm\": 0.8092013543118901,\n\
\ \"acc_norm_stderr\": 0.003921276446819983\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334395,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334395\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n\
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\
\ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\
\ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n\
\ \"acc_stderr\": 0.01639222189940706,\n \"acc_norm\": 0.4011173184357542,\n\
\ \"acc_norm_stderr\": 0.01639222189940706\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.012718456618701758,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.012718456618701758\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545443,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545443\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249776,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249776\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520684,\n \"mc2\": 0.40394087413724206,\n\
\ \"mc2_stderr\": 0.01428909136251767\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698336\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27369219105382864,\n \
\ \"acc_stderr\": 0.01228100349096345\n }\n}\n```"
repo_url: https://huggingface.co/Joseph717171/Mistral-10.7B-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-53-37.777650.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-53-37.777650.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- '**/details_harness|winogrande|5_2024-03-30T15-53-37.777650.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T15-53-37.777650.parquet'
- config_name: results
data_files:
- split: 2024_03_30T15_53_37.777650
path:
- results_2024-03-30T15-53-37.777650.parquet
- split: latest
path:
- results_2024-03-30T15-53-37.777650.parquet
---
# Dataset Card for Evaluation run of Joseph717171/Mistral-10.7B-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Joseph717171/Mistral-10.7B-v0.2](https://huggingface.co/Joseph717171/Mistral-10.7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Joseph717171__Mistral-10.7B-v0.2",
"harness_winogrande_5",
split="train")
```
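The config names above are derived mechanically from the harness task ids: `|`, `-`, and `:` in a task id such as `harness|hendrycksTest-college_physics|5` are each replaced by `_` to yield the config name `harness_hendrycksTest_college_physics_5`. As a sketch (this helper is hypothetical, not part of the `datasets` library), the mapping can be written as:

```python
def task_to_config(task: str) -> str:
    """Map a harness task id (e.g. 'harness|hendrycksTest-college_physics|5')
    to the dataset config name used in this repo
    (e.g. 'harness_hendrycksTest_college_physics_5')."""
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

config = task_to_config("harness|hendrycksTest-college_physics|5")
# With `datasets` installed, this config could then be loaded with, e.g.:
# load_dataset("open-llm-leaderboard/details_Joseph717171__Mistral-10.7B-v0.2",
#              config, split="latest")
```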
## Latest results
These are the [latest results from run 2024-03-30T15:53:37.777650](https://huggingface.co/datasets/open-llm-leaderboard/details_Joseph717171__Mistral-10.7B-v0.2/blob/main/results_2024-03-30T15-53-37.777650.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6290253706195131,
"acc_stderr": 0.03256728907548255,
"acc_norm": 0.6364881368686879,
"acc_norm_stderr": 0.033245641725540745,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520684,
"mc2": 0.40394087413724206,
"mc2_stderr": 0.01428909136251767
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526843,
"acc_norm": 0.5827645051194539,
"acc_norm_stderr": 0.014409825518403084
},
"harness|hellaswag|10": {
"acc": 0.6096395140410277,
"acc_stderr": 0.004868341056566223,
"acc_norm": 0.8092013543118901,
"acc_norm_stderr": 0.003921276446819983
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334395,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334395
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.01639222189940706,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.01639222189940706
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701758,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701758
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249776,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249776
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520684,
"mc2": 0.40394087413724206,
"mc2_stderr": 0.01428909136251767
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698336
},
"harness|gsm8k|5": {
"acc": 0.27369219105382864,
"acc_stderr": 0.01228100349096345
}
}
```
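As a quick illustration (not part of the evaluation harness itself), the per-task dictionary above can be post-processed in plain Python, for example to pick out the MMLU ("hendrycksTest") subtasks and rank them by accuracy. The dict below is a small hypothetical subset of the scores shown above:

```python
# Post-process a results dict shaped like the JSON above.
# These scores are a small subset of the values reported in this card.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8760683760683761},
    "harness|hendrycksTest-virology|5": {"acc": 0.5240963855421686},
    "harness|winogrande|5": {"acc": 0.7734806629834254},  # not an MMLU task
}

# Keep only the MMLU subtasks, keyed by their bare name.
mmlu = {
    name.split("-", 1)[1].split("|")[0]: scores["acc"]
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}

best = max(mmlu, key=mmlu.get)    # subtask with the highest accuracy
worst = min(mmlu, key=mmlu.get)   # subtask with the lowest accuracy
```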
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
imvladikon/tatoeba_heb | ---
dataset_info:
features:
- name: id
dtype: string
- name: eng
dtype: string
- name: heb_translated
dtype: string
- name: heb
dtype: string
- name: labse_score
dtype: float64
- name: e5_score
dtype: float64
- name: heb_neg
dtype: string
splits:
- name: train
num_bytes: 30640830.885270394
num_examples: 151329
download_size: 14615351
dataset_size: 30640830.885270394
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tatoeba_heb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Sample
```json
{
  "id": "1125",
  "eng": "All of our blood goes through our kidneys about sixty times a day.",
  "heb_translated": "כל הדם שלנו עובר דרך הכליות שלנו כשישים פעמים ביום.",
  "heb": "כל הדם שלנו עובר דרך הכליות כשישים פעם ביום.",
  "labse_score": 0.9927536845207214,
  "e5_score": 0.9980219602584839,
  "heb_neg": "אבל כל הזמן אנחנו משתמשים במים כדי לקבל את המזון שלנו."
}
``` |
albertcc/test | ---
license: mit
---
|
open-llm-leaderboard/details_alchemonaut__BoreanGale-70B | ---
pretty_name: Evaluation run of alchemonaut/BoreanGale-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alchemonaut/BoreanGale-70B](https://huggingface.co/alchemonaut/BoreanGale-70B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alchemonaut__BoreanGale-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T23:15:05.818053](https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__BoreanGale-70B/blob/main/results_2024-02-02T23-15-05.818053.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7504730019239859,\n\
\ \"acc_stderr\": 0.028717616307233827,\n \"acc_norm\": 0.7540443972841604,\n\
\ \"acc_norm_stderr\": 0.029263680905302243,\n \"mc1\": 0.5263157894736842,\n\
\ \"mc1_stderr\": 0.017479241161975453,\n \"mc2\": 0.6859618221240749,\n\
\ \"mc2_stderr\": 0.014566147300959674\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623504,\n\
\ \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.012835523909473848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.717486556462856,\n\
\ \"acc_stderr\": 0.004493015945599716,\n \"acc_norm\": 0.8937462656841266,\n\
\ \"acc_norm_stderr\": 0.003075323010408428\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n\
\ \"acc_stderr\": 0.03999262876617722,\n \"acc_norm\": 0.6888888888888889,\n\
\ \"acc_norm_stderr\": 0.03999262876617722\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.03016753346863271,\n\
\ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.03016753346863271\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.025288394502891366,\n\
\ \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.025288394502891366\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n\
\ \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \
\ \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7361702127659574,\n \"acc_stderr\": 0.028809989854102956,\n\
\ \"acc_norm\": 0.7361702127659574,\n \"acc_norm_stderr\": 0.028809989854102956\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n\
\ \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5238095238095238,\n \"acc_stderr\": 0.02572209706438851,\n \"\
acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.02572209706438851\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n\
\ \"acc_stderr\": 0.018225757949432302,\n \"acc_norm\": 0.8838709677419355,\n\
\ \"acc_norm_stderr\": 0.018225757949432302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\"\
: 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865397,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865397\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853102,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853102\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.01742697415424053,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.01742697415424053\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7794871794871795,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.7794871794871795,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227627,\n \
\ \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227627\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.022159373072744442,\n \
\ \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.022159373072744442\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"\
acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9174311926605505,\n \"acc_stderr\": 0.01180036136301657,\n \"\
acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.01180036136301657\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.030546745264953178,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.030546745264953178\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.919831223628692,\n \"acc_stderr\": 0.01767667999189164,\n \
\ \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.01767667999189164\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n\
\ \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n\
\ \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073903,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073903\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.0328818027880863,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.0328818027880863\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n\
\ \"acc_stderr\": 0.017456987872436193,\n \"acc_norm\": 0.9230769230769231,\n\
\ \"acc_norm_stderr\": 0.017456987872436193\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8863346104725415,\n\
\ \"acc_stderr\": 0.011350359050566023,\n \"acc_norm\": 0.8863346104725415,\n\
\ \"acc_norm_stderr\": 0.011350359050566023\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490717,\n\
\ \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490717\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.623463687150838,\n\
\ \"acc_stderr\": 0.016204672385106606,\n \"acc_norm\": 0.623463687150838,\n\
\ \"acc_norm_stderr\": 0.016204672385106606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.021986032182064148,\n\
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.021986032182064148\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.842443729903537,\n\
\ \"acc_stderr\": 0.020692237273583984,\n \"acc_norm\": 0.842443729903537,\n\
\ \"acc_norm_stderr\": 0.020692237273583984\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.019420260109438287,\n\
\ \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.019420260109438287\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5921985815602837,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.5921985815602837,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5990873533246415,\n\
\ \"acc_stderr\": 0.012516960350640816,\n \"acc_norm\": 0.5990873533246415,\n\
\ \"acc_norm_stderr\": 0.012516960350640816\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02388688192244033,\n\
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02388688192244033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262554,\n \
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262554\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n\
\ \"acc_stderr\": 0.019675343217199177,\n \"acc_norm\": 0.9154228855721394,\n\
\ \"acc_norm_stderr\": 0.019675343217199177\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.02190429135575904,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.02190429135575904\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5263157894736842,\n\
\ \"mc1_stderr\": 0.017479241161975453,\n \"mc2\": 0.6859618221240749,\n\
\ \"mc2_stderr\": 0.014566147300959674\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433544\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6732373009855952,\n \
\ \"acc_stderr\": 0.012919408108656424\n }\n}\n```"
repo_url: https://huggingface.co/alchemonaut/BoreanGale-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|arc:challenge|25_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|gsm8k|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hellaswag|10_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T23-15-05.818053.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T23-15-05.818053.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- '**/details_harness|winogrande|5_2024-02-02T23-15-05.818053.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T23-15-05.818053.parquet'
- config_name: results
data_files:
- split: 2024_02_02T23_15_05.818053
path:
- results_2024-02-02T23-15-05.818053.parquet
- split: latest
path:
- results_2024-02-02T23-15-05.818053.parquet
---
# Dataset Card for Evaluation run of alchemonaut/BoreanGale-70B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alchemonaut/BoreanGale-70B](https://huggingface.co/alchemonaut/BoreanGale-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alchemonaut__BoreanGale-70B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T23:15:05.818053](https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__BoreanGale-70B/blob/main/results_2024-02-02T23-15-05.818053.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7504730019239859,
"acc_stderr": 0.028717616307233827,
"acc_norm": 0.7540443972841604,
"acc_norm_stderr": 0.029263680905302243,
"mc1": 0.5263157894736842,
"mc1_stderr": 0.017479241161975453,
"mc2": 0.6859618221240749,
"mc2_stderr": 0.014566147300959674
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623504,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.012835523909473848
},
"harness|hellaswag|10": {
"acc": 0.717486556462856,
"acc_stderr": 0.004493015945599716,
"acc_norm": 0.8937462656841266,
"acc_norm_stderr": 0.003075323010408428
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617722,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617722
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.03016753346863271,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.03016753346863271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.025288394502891366,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.025288394502891366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7361702127659574,
"acc_stderr": 0.028809989854102956,
"acc_norm": 0.7361702127659574,
"acc_norm_stderr": 0.028809989854102956
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.02572209706438851,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.02572209706438851
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432302,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865397,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865397
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853102,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853102
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.01742697415424053,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.01742697415424053
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7794871794871795,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.7794871794871795,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.40370370370370373,
"acc_stderr": 0.029914812342227627,
"acc_norm": 0.40370370370370373,
"acc_norm_stderr": 0.029914812342227627
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.022159373072744442,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.022159373072744442
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.01180036136301657,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.01180036136301657
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.030546745264953178,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.030546745264953178
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.01767667999189164,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.01767667999189164
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.026243194054073903,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.026243194054073903
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6875,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.0328818027880863,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.0328818027880863
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436193,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436193
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8863346104725415,
"acc_stderr": 0.011350359050566023,
"acc_norm": 0.8863346104725415,
"acc_norm_stderr": 0.011350359050566023
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8236994219653179,
"acc_stderr": 0.020516425672490717,
"acc_norm": 0.8236994219653179,
"acc_norm_stderr": 0.020516425672490717
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.623463687150838,
"acc_stderr": 0.016204672385106606,
"acc_norm": 0.623463687150838,
"acc_norm_stderr": 0.016204672385106606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.021986032182064148,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.021986032182064148
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.842443729903537,
"acc_stderr": 0.020692237273583984,
"acc_norm": 0.842443729903537,
"acc_norm_stderr": 0.020692237273583984
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.019420260109438287,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.019420260109438287
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5921985815602837,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.5921985815602837,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5990873533246415,
"acc_stderr": 0.012516960350640816,
"acc_norm": 0.5990873533246415,
"acc_norm_stderr": 0.012516960350640816
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02388688192244033,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02388688192244033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.015422512066262554,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.015422512066262554
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199177,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199177
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.02190429135575904,
"acc_norm": 0.95,
"acc_norm_stderr": 0.02190429135575904
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5263157894736842,
"mc1_stderr": 0.017479241161975453,
"mc2": 0.6859618221240749,
"mc2_stderr": 0.014566147300959674
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433544
},
"harness|gsm8k|5": {
"acc": 0.6732373009855952,
"acc_stderr": 0.012919408108656424
}
}
```
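Once loaded, the per-task metrics in a results payload like the one above can be filtered and ranked programmatically. A minimal sketch follows; it operates on a small hypothetical excerpt of the dict (values copied from this card) rather than the full payload, and the key-parsing convention (`harness|hendrycksTest-<task>|<n_shots>`) is assumed from the task names shown above:

```python
# Hypothetical excerpt of the results dict shown in this card; the real
# payload contains one entry per evaluated task.
results = {
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.95, "acc_norm": 0.95},
    "harness|hendrycksTest-virology|5": {"acc": 0.5783132530120482, "acc_norm": 0.5783132530120482},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8830409356725146, "acc_norm": 0.8830409356725146},
    "harness|winogrande|5": {"acc": 0.8453038674033149},
}

# Keep only the MMLU (hendrycksTest) subtasks, stripping the
# "harness|hendrycksTest-" prefix and the "|5" few-shot suffix.
mmlu = {
    name.split("-", 1)[1].rsplit("|", 1)[0]: metrics["acc"]
    for name, metrics in results.items()
    if name.startswith("harness|hendrycksTest-")
}

# Rank subtasks by accuracy, best first.
best_task = max(mmlu, key=mmlu.get)
print(best_task, round(mmlu[best_task], 4))  # → us_foreign_policy 0.95
```

The same pattern extends to `acc_norm` or the `mc1`/`mc2` TruthfulQA metrics by swapping the metric key.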
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Technoculture__Mediquad-4x7b | ---
pretty_name: Evaluation run of Technoculture/Mediquad-4x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Technoculture/Mediquad-4x7b](https://huggingface.co/Technoculture/Mediquad-4x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Mediquad-4x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T08:24:28.609699](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Mediquad-4x7b/blob/main/results_2024-01-16T08-24-28.609699.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28390168620155787,\n\
\ \"acc_stderr\": 0.031652775276322986,\n \"acc_norm\": 0.2863068103768225,\n\
\ \"acc_norm_stderr\": 0.032507978612879115,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.49560252838701496,\n\
\ \"mc2_stderr\": 0.016847234757977527\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21075085324232082,\n \"acc_stderr\": 0.011918271754852165,\n\
\ \"acc_norm\": 0.27474402730375425,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26687910774746065,\n\
\ \"acc_stderr\": 0.004414246720076111,\n \"acc_norm\": 0.28211511651065524,\n\
\ \"acc_norm_stderr\": 0.004491093528113431\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111836,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111836\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.02977164271249123,\n\
\ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.02977164271249123\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577657,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114482,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114482\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29797979797979796,\n \"acc_stderr\": 0.03258630383836556,\n \"\
acc_norm\": 0.29797979797979796,\n \"acc_norm_stderr\": 0.03258630383836556\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635474,\n\
\ \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635474\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.02865749128507198,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.02865749128507198\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26422018348623855,\n \"acc_stderr\": 0.01890416417151019,\n \"\
acc_norm\": 0.26422018348623855,\n \"acc_norm_stderr\": 0.01890416417151019\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3627450980392157,\n\
\ \"acc_stderr\": 0.03374499356319355,\n \"acc_norm\": 0.3627450980392157,\n\
\ \"acc_norm_stderr\": 0.03374499356319355\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.4388185654008439,\n \"acc_stderr\": 0.032302649315470375,\n\
\ \"acc_norm\": 0.4388185654008439,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15246636771300448,\n\
\ \"acc_stderr\": 0.024126204813252877,\n \"acc_norm\": 0.15246636771300448,\n\
\ \"acc_norm_stderr\": 0.024126204813252877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.21487603305785125,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\
\ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \
\ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.39316239316239315,\n\
\ \"acc_stderr\": 0.03199957924651047,\n \"acc_norm\": 0.39316239316239315,\n\
\ \"acc_norm_stderr\": 0.03199957924651047\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.015594955384455763,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.015594955384455763\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409153,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409153\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.02600480036395211,\n\
\ \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.02600480036395211\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3665594855305466,\n\
\ \"acc_stderr\": 0.02736807824397163,\n \"acc_norm\": 0.3665594855305466,\n\
\ \"acc_norm_stderr\": 0.02736807824397163\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705477,\n \
\ \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705477\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2861799217731421,\n\
\ \"acc_stderr\": 0.011543642878150755,\n \"acc_norm\": 0.2861799217731421,\n\
\ \"acc_norm_stderr\": 0.011543642878150755\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.027778298701545443,\n\
\ \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.027778298701545443\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.018120224251484587,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.018120224251484587\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3673469387755102,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.3673469387755102,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.42786069651741293,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.42786069651741293,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.18128654970760233,\n \"acc_stderr\": 0.029547741687640024,\n\
\ \"acc_norm\": 0.18128654970760233,\n \"acc_norm_stderr\": 0.029547741687640024\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.49560252838701496,\n\
\ \"mc2_stderr\": 0.016847234757977527\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.01405174596179052\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Technoculture/Mediquad-4x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|arc:challenge|25_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|gsm8k|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hellaswag|10_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T08-24-28.609699.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T08-24-28.609699.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- '**/details_harness|winogrande|5_2024-01-16T08-24-28.609699.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T08-24-28.609699.parquet'
- config_name: results
data_files:
- split: 2024_01_16T08_24_28.609699
path:
- results_2024-01-16T08-24-28.609699.parquet
- split: latest
path:
- results_2024-01-16T08-24-28.609699.parquet
---
# Dataset Card for Evaluation run of Technoculture/Mediquad-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Mediquad-4x7b](https://huggingface.co/Technoculture/Mediquad-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Mediquad-4x7b",
"harness_winogrande_5",
split="train")
```
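The timestamped split names follow the `YYYY_MM_DDTHH_MM_SS.micro` pattern seen throughout this card. A minimal sketch (a hypothetical helper, not part of the leaderboard tooling) for turning such a split name back into a `datetime`:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2024_01_16T08_24_28.609699":
    # underscores separate the date and time fields, "T" separates date from time,
    # and the fractional part carries microseconds.
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run_time = split_to_datetime("2024_01_16T08_24_28.609699")
```

This can be handy for sorting the non-`latest` splits of a configuration chronologically.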
## Latest results
These are the [latest results from run 2024-01-16T08:24:28.609699](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Mediquad-4x7b/blob/main/results_2024-01-16T08-24-28.609699.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28390168620155787,
"acc_stderr": 0.031652775276322986,
"acc_norm": 0.2863068103768225,
"acc_norm_stderr": 0.032507978612879115,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.49560252838701496,
"mc2_stderr": 0.016847234757977527
},
"harness|arc:challenge|25": {
"acc": 0.21075085324232082,
"acc_stderr": 0.011918271754852165,
"acc_norm": 0.27474402730375425,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.26687910774746065,
"acc_stderr": 0.004414246720076111,
"acc_norm": 0.28211511651065524,
"acc_norm_stderr": 0.004491093528113431
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111836,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111836
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.02977164271249123,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.02977164271249123
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577657,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.15,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114482,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114482
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29797979797979796,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.29797979797979796,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.02865749128507198,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.02865749128507198
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26422018348623855,
"acc_stderr": 0.01890416417151019,
"acc_norm": 0.26422018348623855,
"acc_norm_stderr": 0.01890416417151019
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.03374499356319355,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.03374499356319355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4388185654008439,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.4388185654008439,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15246636771300448,
"acc_stderr": 0.024126204813252877,
"acc_norm": 0.15246636771300448,
"acc_norm_stderr": 0.024126204813252877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.39316239316239315,
"acc_stderr": 0.03199957924651047,
"acc_norm": 0.39316239316239315,
"acc_norm_stderr": 0.03199957924651047
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455763,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455763
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409153,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409153
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.02600480036395211,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.02600480036395211
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3665594855305466,
"acc_stderr": 0.02736807824397163,
"acc_norm": 0.3665594855305466,
"acc_norm_stderr": 0.02736807824397163
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705477,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2861799217731421,
"acc_stderr": 0.011543642878150755,
"acc_norm": 0.2861799217731421,
"acc_norm_stderr": 0.011543642878150755
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2977941176470588,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.2977941176470588,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.018120224251484587,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.018120224251484587
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3673469387755102,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.3673469387755102,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.42786069651741293,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.42786069651741293,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.18128654970760233,
"acc_stderr": 0.029547741687640024,
"acc_norm": 0.18128654970760233,
"acc_norm_stderr": 0.029547741687640024
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.49560252838701496,
"mc2_stderr": 0.016847234757977527
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.01405174596179052
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
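The JSON above is the shape of what the `results` configuration returns. A small sketch (a hypothetical helper, not part of the leaderboard tooling) for ranking the per-task accuracies in such a dict, using a few values copied from the card for illustration:

```python
def tasks_by_acc(results):
    """Return (task, acc) pairs sorted by accuracy, lowest first, skipping 'all'."""
    return sorted(
        ((name, scores["acc"]) for name, scores in results.items()
         if name != "all" and "acc" in scores),
        key=lambda pair: pair[1],
    )

# Values below are copied from the results JSON in this card.
sample = {
    "all": {"acc": 0.28390168620155787},
    "harness|arc:challenge|25": {"acc": 0.21075085324232082},
    "harness|hellaswag|10": {"acc": 0.26687910774746065},
    "harness|winogrande|5": {"acc": 0.505130228887135},
}
ranked = tasks_by_acc(sample)
```

Note that `truthfulqa:mc` entries carry `mc1`/`mc2` rather than `acc`, so they are skipped by the `"acc" in scores` check.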
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain | ---
pretty_name: Evaluation run of YeungNLP/firefly-llama2-7b-pretrain
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-llama2-7b-pretrain](https://huggingface.co/YeungNLP/firefly-llama2-7b-pretrain)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T22:23:31.822890](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain/blob/main/results_2023-10-24T22-23-31.822890.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335575,\n \"f1\": 0.0473752097315439,\n\
\ \"f1_stderr\": 0.0011829405023092946,\n \"acc\": 0.36752358971812016,\n\
\ \"acc_stderr\": 0.008870377138277116\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335575,\n\
\ \"f1\": 0.0473752097315439,\n \"f1_stderr\": 0.0011829405023092946\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.032600454890068235,\n \
\ \"acc_stderr\": 0.004891669021939581\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7024467245461721,\n \"acc_stderr\": 0.012849085254614652\n\
\ }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-llama2-7b-pretrain
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T22_23_31.822890
path:
- '**/details_harness|drop|3_2023-10-24T22-23-31.822890.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T22-23-31.822890.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T22_23_31.822890
path:
- '**/details_harness|gsm8k|5_2023-10-24T22-23-31.822890.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T22-23-31.822890.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-29-37.507273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-29-37.507273.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T15-29-37.507273.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T22_23_31.822890
path:
- '**/details_harness|winogrande|5_2023-10-24T22-23-31.822890.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T22-23-31.822890.parquet'
- config_name: results
data_files:
- split: 2023_09_11T15_29_37.507273
path:
- results_2023-09-11T15-29-37.507273.parquet
- split: 2023_10_24T22_23_31.822890
path:
- results_2023-10-24T22-23-31.822890.parquet
- split: latest
path:
- results_2023-10-24T22-23-31.822890.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-7b-pretrain
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-llama2-7b-pretrain
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama2-7b-pretrain](https://huggingface.co/YeungNLP/firefly-llama2-7b-pretrain) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain",
"harness_winogrande_5",
split="train")
```
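The split name for a given run is simply its timestamp with `-` and `:` replaced by `_` (the dot before the microseconds is kept), e.g. run `2023-10-24T22:23:31.822890` lives in split `2023_10_24T22_23_31.822890`. A minimal helper, assuming that naming convention holds, for deriving a split name from an ISO run timestamp:

```python
def run_split_name(timestamp: str) -> str:
    """Map an ISO run timestamp to its dataset split name.

    Assumes the convention visible in the configs above: "-" and ":"
    become "_", while the "." before the microseconds is preserved.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# run_split_name("2023-10-24T22:23:31.822890") -> "2023_10_24T22_23_31.822890"
```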
## Latest results
These are the [latest results from run 2023-10-24T22:23:31.822890](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-pretrain/blob/main/results_2023-10-24T22-23-31.822890.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335575,
"f1": 0.0473752097315439,
"f1_stderr": 0.0011829405023092946,
"acc": 0.36752358971812016,
"acc_stderr": 0.008870377138277116
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335575,
"f1": 0.0473752097315439,
"f1_stderr": 0.0011829405023092946
},
"harness|gsm8k|5": {
"acc": 0.032600454890068235,
"acc_stderr": 0.004891669021939581
},
"harness|winogrande|5": {
"acc": 0.7024467245461721,
"acc_stderr": 0.012849085254614652
}
}
```
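The aggregated metrics above are nested per task. A short sketch (using the figures above, with the stderr fields omitted for brevity) of flattening them into `(task, metric, value)` rows, which makes comparison across runs or models easier:

```python
# Flatten the nested per-task metrics (figures copied from the
# latest results shown above) into (task, metric, value) rows.
results = {
    "harness|drop|3": {"em": 0.0007340604026845638, "f1": 0.0473752097315439},
    "harness|gsm8k|5": {"acc": 0.032600454890068235},
    "harness|winogrande|5": {"acc": 0.7024467245461721},
}

rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]

for task, metric, value in rows:
    print(f"{task}\t{metric}\t{value:.4f}")
```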
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
redwoodresearch/diamonds-seed1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: is_correct
dtype: bool
- name: is_clean
dtype: bool
- name: measurements
sequence: bool
- name: difficulty
dtype: int64
splits:
- name: train
num_bytes: 62363093
num_examples: 25000
- name: validation
num_bytes: 20139849
num_examples: 7989
- name: train_for_val
num_bytes: 7545690
num_examples: 2997
download_size: 1101298
dataset_size: 90048632
---
# Dataset Card for "diamonds-seed1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Qwen__Qwen2-beta-14B | ---
pretty_name: Evaluation run of Qwen/Qwen2-beta-14B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Qwen/Qwen2-beta-14B](https://huggingface.co/Qwen/Qwen2-beta-14B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen2-beta-14B_private\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2024-01-30T11:39:41.356084](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen2-beta-14B_private/blob/main/results_2024-01-30T11-39-41.356084.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6762699014404853,\n\
\ \"acc_stderr\": 0.012888247397371141\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.6762699014404853,\n \"acc_stderr\": 0.012888247397371141\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Qwen/Qwen2-beta-14B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|arc:challenge|25_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|gsm8k|5_2024-01-29T15-38-03.755073.parquet'
- split: 2024_01_30T11_39_41.356084
path:
- '**/details_harness|gsm8k|5_2024-01-30T11-39-41.356084.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-30T11-39-41.356084.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hellaswag|10_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T15-38-03.755073.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T15-38-03.755073.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- '**/details_harness|winogrande|5_2024-01-29T15-38-03.755073.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T15-38-03.755073.parquet'
- config_name: results
data_files:
- split: 2024_01_29T15_38_03.755073
path:
- results_2024-01-29T15-38-03.755073.parquet
- split: 2024_01_30T11_39_41.356084
path:
- results_2024-01-30T11-39-41.356084.parquet
- split: latest
path:
- results_2024-01-30T11-39-41.356084.parquet
---
# Dataset Card for Evaluation run of Qwen/Qwen2-beta-14B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Qwen/Qwen2-beta-14B](https://huggingface.co/Qwen/Qwen2-beta-14B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen2-beta-14B_private",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-30T11:39:41.356084](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen2-beta-14B_private/blob/main/results_2024-01-30T11-39-41.356084.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in its own configuration and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6762699014404853,
"acc_stderr": 0.012888247397371141
},
"harness|gsm8k|5": {
"acc": 0.6762699014404853,
"acc_stderr": 0.012888247397371141
}
}
```
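As a minimal sketch of how these metrics can be read programmatically (using an inline copy of the JSON above rather than downloading the `results_*.json` file), the aggregated block parses with the standard library alone:

```python
import json

# Copy of the aggregated results shown above; in practice you would read
# the results_*.json file from the repository instead.
results_json = """
{
    "all": {
        "acc": 0.6762699014404853,
        "acc_stderr": 0.012888247397371141
    },
    "harness|gsm8k|5": {
        "acc": 0.6762699014404853,
        "acc_stderr": 0.012888247397371141
    }
}
"""

results = json.loads(results_json)

# "all" aggregates across tasks; per-task keys follow the
# "harness|<task>|<n_shots>" naming pattern.
gsm8k_acc = results["harness|gsm8k|5"]["acc"]
print(f"gsm8k 5-shot accuracy: {gsm8k_acc:.4f}")  # gsm8k 5-shot accuracy: 0.6763
```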
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta | ---
pretty_name: Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T07:34:28.878740](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta/blob/main/results_2024-02-13T07-34-28.878740.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5960695400984635,\n\
\ \"acc_stderr\": 0.03323664542144247,\n \"acc_norm\": 0.6045531326002294,\n\
\ \"acc_norm_stderr\": 0.03395162360261904,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5235673204188883,\n\
\ \"mc2_stderr\": 0.016399852429558985\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5460750853242321,\n \"acc_stderr\": 0.014549221105171867,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642662\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6106353316072496,\n\
\ \"acc_stderr\": 0.0048660968809414425,\n \"acc_norm\": 0.7982473610834495,\n\
\ \"acc_norm_stderr\": 0.004004883380078933\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.02496268356433179,\n \
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.02496268356433179\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787586,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787586\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333564,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333564\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069706,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069706\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n\
\ \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n\
\ \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545715,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545715\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811942,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811942\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924803,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924803\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.0279626776047689,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.0279626776047689\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5235673204188883,\n\
\ \"mc2_stderr\": 0.016399852429558985\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.012441718456893012\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19029567854435178,\n \
\ \"acc_stderr\": 0.01081234728318298\n }\n}\n```"
repo_url: https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|arc:challenge|25_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|gsm8k|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hellaswag|10_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T07-34-28.878740.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T07-34-28.878740.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- '**/details_harness|winogrande|5_2024-02-13T07-34-28.878740.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T07-34-28.878740.parquet'
- config_name: results
data_files:
- split: 2024_02_13T07_34_28.878740
path:
- results_2024-02-13T07-34-28.878740.parquet
- split: latest
path:
- results_2024-02-13T07-34-28.878740.parquet
---
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T07:34:28.878740](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-DPO-MixQV2-SOLIDRejected-SFTChosen-Zephyr-7b-beta/blob/main/results_2024-02-13T07-34-28.878740.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5960695400984635,
"acc_stderr": 0.03323664542144247,
"acc_norm": 0.6045531326002294,
"acc_norm_stderr": 0.03395162360261904,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5235673204188883,
"mc2_stderr": 0.016399852429558985
},
"harness|arc:challenge|25": {
"acc": 0.5460750853242321,
"acc_stderr": 0.014549221105171867,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642662
},
"harness|hellaswag|10": {
"acc": 0.6106353316072496,
"acc_stderr": 0.0048660968809414425,
"acc_norm": 0.7982473610834495,
"acc_norm_stderr": 0.004004883380078933
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.02496268356433179,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.02496268356433179
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787586,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333564,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333564
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069706,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811942,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811942
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924803,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.0279626776047689,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.0279626776047689
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5235673204188883,
"mc2_stderr": 0.016399852429558985
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.012441718456893012
},
"harness|gsm8k|5": {
"acc": 0.19029567854435178,
"acc_stderr": 0.01081234728318298
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_sequelbox__StellarBright | ---
pretty_name: Evaluation run of sequelbox/StellarBright
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sequelbox/StellarBright](https://huggingface.co/sequelbox/StellarBright) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sequelbox__StellarBright_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-08T22:55:36.010619](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__StellarBright_public/blob/main/results_2023-11-08T22-55-36.010619.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.34458892617449666,\n\
\ \"em_stderr\": 0.004866841438021566,\n \"f1\": 0.4966107382550379,\n\
\ \"f1_stderr\": 0.004389897684698882,\n \"acc\": 0.613835910465284,\n\
\ \"acc_stderr\": 0.011977981888400647\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.34458892617449666,\n \"em_stderr\": 0.004866841438021566,\n\
\ \"f1\": 0.4966107382550379,\n \"f1_stderr\": 0.004389897684698882\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3949962092494314,\n \
\ \"acc_stderr\": 0.01346535496997321\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828082\n\
\ }\n}\n```"
repo_url: https://huggingface.co/sequelbox/StellarBright
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_08T22_55_36.010619
path:
- '**/details_harness|drop|3_2023-11-08T22-55-36.010619.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-08T22-55-36.010619.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_08T22_55_36.010619
path:
- '**/details_harness|gsm8k|5_2023-11-08T22-55-36.010619.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-08T22-55-36.010619.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_08T22_55_36.010619
path:
- '**/details_harness|winogrande|5_2023-11-08T22-55-36.010619.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-08T22-55-36.010619.parquet'
- config_name: results
data_files:
- split: 2023_11_08T22_55_36.010619
path:
- results_2023-11-08T22-55-36.010619.parquet
- split: latest
path:
- results_2023-11-08T22-55-36.010619.parquet
---
# Dataset Card for Evaluation run of sequelbox/StellarBright
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sequelbox/StellarBright
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sequelbox/StellarBright](https://huggingface.co/sequelbox/StellarBright) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sequelbox__StellarBright_public",
"harness_winogrande_5",
split="train")
```
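Besides the `latest` split, each run is exposed as a split named after its timestamp (e.g. `2023_11_08T22_55_36.010619` in the configs above). As a minimal sketch, assuming that split-name format, you can sort such names chronologically with the standard library to pick the most recent run yourself:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a timestamped split name like '2023_11_08T22_55_36.010619'."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

def latest_split(split_names) -> str:
    """Return the most recent timestamped split name."""
    return max(split_names, key=parse_split_timestamp)

splits = ["2023_10_01T09_00_00.000000", "2023_11_08T22_55_36.010619"]
print(latest_split(splits))  # 2023_11_08T22_55_36.010619
```

This is only a convenience for inspecting multiple runs; for everyday use, requesting `split="latest"` (or `"train"`) gives the newest results directly.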
## Latest results
These are the [latest results from run 2023-11-08T22:55:36.010619](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__StellarBright_public/blob/main/results_2023-11-08T22-55-36.010619.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.34458892617449666,
"em_stderr": 0.004866841438021566,
"f1": 0.4966107382550379,
"f1_stderr": 0.004389897684698882,
"acc": 0.613835910465284,
"acc_stderr": 0.011977981888400647
},
"harness|drop|3": {
"em": 0.34458892617449666,
"em_stderr": 0.004866841438021566,
"f1": 0.4966107382550379,
"f1_stderr": 0.004389897684698882
},
"harness|gsm8k|5": {
"acc": 0.3949962092494314,
"acc_stderr": 0.01346535496997321
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828082
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Euclid/chammuu | ---
license: other
---
|
open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-2 | ---
pretty_name: Evaluation run of juhwanlee/gemma-7B-alpaca-case-3-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [juhwanlee/gemma-7B-alpaca-case-3-2](https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-3-2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T18:08:32.071039](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-2/blob/main/results_2024-03-27T18-08-32.071039.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2813608182150032,\n\
\ \"acc_stderr\": 0.03158116455716802,\n \"acc_norm\": 0.2831474632466192,\n\
\ \"acc_norm_stderr\": 0.03242830518837846,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752323,\n \"mc2\": 0.40993581342319196,\n\
\ \"mc2_stderr\": 0.0158234411590989\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.29692832764505117,\n \"acc_stderr\": 0.013352025976725225,\n\
\ \"acc_norm\": 0.33276450511945393,\n \"acc_norm_stderr\": 0.013769863046192305\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38099980083648677,\n\
\ \"acc_stderr\": 0.004846400325585234,\n \"acc_norm\": 0.4924317864967138,\n\
\ \"acc_norm_stderr\": 0.004989209770743233\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n\
\ \"acc_stderr\": 0.025560604721022895,\n \"acc_norm\": 0.2806451612903226,\n\
\ \"acc_norm_stderr\": 0.025560604721022895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292947,\n\
\ \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292947\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756777,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756777\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193338,\n\
\ \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193338\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2641025641025641,\n \"acc_stderr\": 0.022352193737453268,\n\
\ \"acc_norm\": 0.2641025641025641,\n \"acc_norm_stderr\": 0.022352193737453268\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267613,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267613\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31092436974789917,\n \"acc_stderr\": 0.03006676158297794,\n\
\ \"acc_norm\": 0.31092436974789917,\n \"acc_norm_stderr\": 0.03006676158297794\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.30825688073394497,\n \"acc_stderr\": 0.019798366698367268,\n \"\
acc_norm\": 0.30825688073394497,\n \"acc_norm_stderr\": 0.019798366698367268\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395593,\n\
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.36752136752136755,\n\
\ \"acc_stderr\": 0.03158539157745636,\n \"acc_norm\": 0.36752136752136755,\n\
\ \"acc_norm_stderr\": 0.03158539157745636\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.36909323116219667,\n\
\ \"acc_stderr\": 0.017256283109124606,\n \"acc_norm\": 0.36909323116219667,\n\
\ \"acc_norm_stderr\": 0.017256283109124606\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855713,\n\
\ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n\
\ \"acc_stderr\": 0.0254038329781796,\n \"acc_norm\": 0.2765273311897106,\n\
\ \"acc_norm_stderr\": 0.0254038329781796\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053738,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053738\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.027257202606114955,\n\
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.027257202606114955\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594726,\n \
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594726\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4327485380116959,\n \"acc_stderr\": 0.03799978644370607,\n\
\ \"acc_norm\": 0.4327485380116959,\n \"acc_norm_stderr\": 0.03799978644370607\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752323,\n \"mc2\": 0.40993581342319196,\n\
\ \"mc2_stderr\": 0.0158234411590989\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6045777426992897,\n \"acc_stderr\": 0.013741678387545343\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-3-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-08-32.071039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-08-32.071039.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- '**/details_harness|winogrande|5_2024-03-27T18-08-32.071039.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T18-08-32.071039.parquet'
- config_name: results
data_files:
- split: 2024_03_27T18_08_32.071039
path:
- results_2024-03-27T18-08-32.071039.parquet
- split: latest
path:
- results_2024-03-27T18-08-32.071039.parquet
---
# Dataset Card for Evaluation run of juhwanlee/gemma-7B-alpaca-case-3-2
Dataset automatically created during the evaluation run of model [juhwanlee/gemma-7B-alpaca-case-3-2](https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-3-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-2",
"harness_winogrande_5",
split="train")
```
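Each per-task configuration name above follows a regular pattern derived from the harness task identifier and the few-shot count. The helper below is a small sketch of that mapping, inferred from the configuration list in this card (it is not part of any official API): colons and dashes in the harness task id become underscores, and the few-shot count is appended.

```python
def harness_task_to_config(task: str, num_fewshot: int) -> str:
    """Build a dataset config name from a harness task id and few-shot count.

    Inferred from this card's config list, e.g.
    "hendrycksTest-world_religions" with 5 shots maps to
    "harness_hendrycksTest_world_religions_5".
    """
    # Colons (e.g. "truthfulqa:mc") and dashes become underscores.
    cleaned = task.replace(":", "_").replace("-", "_")
    return f"harness_{cleaned}_{num_fewshot}"


print(harness_task_to_config("hendrycksTest-world_religions", 5))
# -> harness_hendrycksTest_world_religions_5
print(harness_task_to_config("truthfulqa:mc", 0))
# -> harness_truthfulqa_mc_0
```

This can be handy for iterating over all 57 MMLU subtask configs programmatically instead of hard-coding each name.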
## Latest results
These are the [latest results from run 2024-03-27T18:08:32.071039](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-3-2/blob/main/results_2024-03-27T18-08-32.071039.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2813608182150032,
"acc_stderr": 0.03158116455716802,
"acc_norm": 0.2831474632466192,
"acc_norm_stderr": 0.03242830518837846,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752323,
"mc2": 0.40993581342319196,
"mc2_stderr": 0.0158234411590989
},
"harness|arc:challenge|25": {
"acc": 0.29692832764505117,
"acc_stderr": 0.013352025976725225,
"acc_norm": 0.33276450511945393,
"acc_norm_stderr": 0.013769863046192305
},
"harness|hellaswag|10": {
"acc": 0.38099980083648677,
"acc_stderr": 0.004846400325585234,
"acc_norm": 0.4924317864967138,
"acc_norm_stderr": 0.004989209770743233
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2806451612903226,
"acc_stderr": 0.025560604721022895,
"acc_norm": 0.2806451612903226,
"acc_norm_stderr": 0.025560604721022895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292947,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292947
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756777,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756777
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.03499807276193338,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.03499807276193338
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2641025641025641,
"acc_stderr": 0.022352193737453268,
"acc_norm": 0.2641025641025641,
"acc_norm_stderr": 0.022352193737453268
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267613,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267613
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31092436974789917,
"acc_stderr": 0.03006676158297794,
"acc_norm": 0.31092436974789917,
"acc_norm_stderr": 0.03006676158297794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30825688073394497,
"acc_stderr": 0.019798366698367268,
"acc_norm": 0.30825688073394497,
"acc_norm_stderr": 0.019798366698367268
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.36752136752136755,
"acc_stderr": 0.03158539157745636,
"acc_norm": 0.36752136752136755,
"acc_norm_stderr": 0.03158539157745636
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.36909323116219667,
"acc_stderr": 0.017256283109124606,
"acc_norm": 0.36909323116219667,
"acc_norm_stderr": 0.017256283109124606
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.022698657167855713,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.022698657167855713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.0254038329781796,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.0254038329781796
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053738,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053738
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880596,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880596
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.027257202606114955,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.027257202606114955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594726,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4327485380116959,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.4327485380116959,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752323,
"mc2": 0.40993581342319196,
"mc2_stderr": 0.0158234411590989
},
"harness|winogrande|5": {
"acc": 0.6045777426992897,
"acc_stderr": 0.013741678387545343
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
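The per-task entries above all share the same shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so they can be ranked programmatically once loaded. A minimal sketch, using a few entries copied from the results above (the helper name is ours, not part of the leaderboard tooling):

```python
# Rank MMLU-style tasks by normalized accuracy (acc_norm).
# The sample entries below are copied from the results shown above.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.4327485380116959},
    "harness|hendrycksTest-human_aging|5": {"acc_norm": 0.4304932735426009},
    "harness|hendrycksTest-security_studies|5": {"acc_norm": 0.18775510204081633},
    "harness|hendrycksTest-high_school_statistics|5": {"acc_norm": 0.16203703703703703},
}

def rank_by_acc_norm(results: dict) -> list:
    """Return (task, acc_norm) pairs sorted best-first."""
    return sorted(
        ((task, scores["acc_norm"]) for task, scores in results.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

ranked = rank_by_acc_norm(results)
print(ranked[0][0])   # best task:  harness|hendrycksTest-world_religions|5
print(ranked[-1][0])  # worst task: harness|hendrycksTest-high_school_statistics|5
```

The same pattern applies to any of the `harness|hendrycksTest-*` entries in the full results dictionary.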
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Jing24/generate_sub_4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 42500621
num_examples: 46640
download_size: 0
dataset_size: 42500621
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generate_sub_4"
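Each record pairs a `context` with a `question` and a SQuAD-style `answers` struct holding parallel `answer_start` and `text` sequences, per the `dataset_info` features above. A minimal sketch of how the two sequences line up (the sample record is invented for illustration):

```python
# SQuAD-style record: answers["answer_start"][i] is the character offset
# of answers["text"][i] inside the context. Sample record is invented.
record = {
    "id": "0",
    "title": "Example",
    "context": "The Eiffel Tower is located in Paris.",
    "question": "Where is the Eiffel Tower located?",
    "answers": {"answer_start": [31], "text": ["Paris"]},
}

for start, text in zip(record["answers"]["answer_start"], record["answers"]["text"]):
    # Each answer span can be recovered directly from the context.
    assert record["context"][start : start + len(text)] == text
```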
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KayoSilva88777/Letis_Go | ---
license: openrail
---
|
cp500/radiology-samples | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 105035647
num_examples: 135466
- name: test
num_bytes: 26470297
num_examples: 33869
download_size: 54294813
dataset_size: 131505944
---
# Dataset Card for "radiology-samples"
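The `instruction`/`input`/`output` triple in the features above matches the common Alpaca-style schema, so records can be flattened into a single training string. A minimal sketch (the prompt template and sample record are illustrative, not taken from this dataset):

```python
def build_prompt(record: dict) -> str:
    """Flatten an instruction/input/output record into one training string."""
    if record["input"]:
        prompt = (
            f"### Instruction:\n{record['instruction']}\n\n"
            f"### Input:\n{record['input']}\n\n### Response:\n"
        )
    else:
        prompt = f"### Instruction:\n{record['instruction']}\n\n### Response:\n"
    return prompt + record["output"]

# Illustrative record, not an actual sample from the dataset.
sample = {
    "instruction": "Summarize the findings of this radiology report.",
    "input": "Chest X-ray shows no acute cardiopulmonary abnormality.",
    "output": "Normal chest X-ray.",
}
print(build_prompt(sample))
```

Records with an empty `input` simply drop the `### Input:` section.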
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_llm-agents__tora-13b-v1.0 | ---
pretty_name: Evaluation run of llm-agents/tora-13b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-13b-v1.0](https://huggingface.co/llm-agents/tora-13b-v1.0) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-13b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T23:02:54.555967](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-13b-v1.0/blob/main/results_2024-01-04T23-02-54.555967.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5454718360054382,\n\
\ \"acc_stderr\": 0.033711742991346105,\n \"acc_norm\": 0.5513853830318955,\n\
\ \"acc_norm_stderr\": 0.034440663087187254,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4022040509062026,\n\
\ \"mc2_stderr\": 0.01499246594878979\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348897,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6361282613025294,\n\
\ \"acc_stderr\": 0.004801290954387086,\n \"acc_norm\": 0.8231428002389962,\n\
\ \"acc_norm_stderr\": 0.0038076803311729033\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.038124005659748335,\n\
\ \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.038124005659748335\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n\
\ \"acc_stderr\": 0.044405219061793275,\n \"acc_norm\": 0.27450980392156865,\n\
\ \"acc_norm_stderr\": 0.044405219061793275\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.40425531914893614,\n \"acc_stderr\": 0.032081157507886836,\n \"\
acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.02531063925493389,\n \
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.02531063925493389\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.018776052319619627,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.018776052319619627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.02969633871342288,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.02969633871342288\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848607,\n\
\ \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848607\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7369093231162197,\n\
\ \"acc_stderr\": 0.015745497169049053,\n \"acc_norm\": 0.7369093231162197,\n\
\ \"acc_norm_stderr\": 0.015745497169049053\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016134,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016134\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761976,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761976\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.02811092849280907,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.02811092849280907\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647012,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.409387222946545,\n\
\ \"acc_stderr\": 0.012558780895570752,\n \"acc_norm\": 0.409387222946545,\n\
\ \"acc_norm_stderr\": 0.012558780895570752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5359477124183006,\n \"acc_stderr\": 0.020175488765484043,\n \
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.020175488765484043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5755102040816327,\n \"acc_stderr\": 0.031642094879429414,\n\
\ \"acc_norm\": 0.5755102040816327,\n \"acc_norm_stderr\": 0.031642094879429414\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4022040509062026,\n\
\ \"mc2_stderr\": 0.01499246594878979\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437521\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20773313115996966,\n \
\ \"acc_stderr\": 0.011174572716705902\n }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-13b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|arc:challenge|25_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T06_57_18.434824
path:
- '**/details_harness|drop|3_2023-10-29T06-57-18.434824.parquet'
- split: 2023_10_29T07_05_06.186132
path:
- '**/details_harness|drop|3_2023-10-29T07-05-06.186132.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T07-05-06.186132.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T06_57_18.434824
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-57-18.434824.parquet'
- split: 2023_10_29T07_05_06.186132
path:
- '**/details_harness|gsm8k|5_2023-10-29T07-05-06.186132.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|gsm8k|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hellaswag|10_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-02-54.555967.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-17-02.134278.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T23-02-54.555967.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T06_57_18.434824
path:
- '**/details_harness|winogrande|5_2023-10-29T06-57-18.434824.parquet'
- split: 2023_10_29T07_05_06.186132
path:
- '**/details_harness|winogrande|5_2023-10-29T07-05-06.186132.parquet'
- split: 2024_01_04T23_02_54.555967
path:
- '**/details_harness|winogrande|5_2024-01-04T23-02-54.555967.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T23-02-54.555967.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- results_2023-10-10T15-17-02.134278.parquet
- split: 2023_10_29T06_57_18.434824
path:
- results_2023-10-29T06-57-18.434824.parquet
- split: 2023_10_29T07_05_06.186132
path:
- results_2023-10-29T07-05-06.186132.parquet
- split: 2024_01_04T23_02_54.555967
path:
- results_2024-01-04T23-02-54.555967.parquet
- split: latest
path:
- results_2024-01-04T23-02-54.555967.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-13b-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [llm-agents/tora-13b-v1.0](https://huggingface.co/llm-agents/tora-13b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-13b-v1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-04T23:02:54.555967](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-13b-v1.0/blob/main/results_2024-01-04T23-02-54.555967.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5454718360054382,
"acc_stderr": 0.033711742991346105,
"acc_norm": 0.5513853830318955,
"acc_norm_stderr": 0.034440663087187254,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.4022040509062026,
"mc2_stderr": 0.01499246594878979
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348897,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642664
},
"harness|hellaswag|10": {
"acc": 0.6361282613025294,
"acc_stderr": 0.004801290954387086,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.0038076803311729033
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.02531063925493389,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.02531063925493389
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.018776052319619627,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.018776052319619627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848607,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7369093231162197,
"acc_stderr": 0.015745497169049053,
"acc_norm": 0.7369093231162197,
"acc_norm_stderr": 0.015745497169049053
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016134,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761976,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761976
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647012,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.409387222946545,
"acc_stderr": 0.012558780895570752,
"acc_norm": 0.409387222946545,
"acc_norm_stderr": 0.012558780895570752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.020175488765484043,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.020175488765484043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5755102040816327,
"acc_stderr": 0.031642094879429414,
"acc_norm": 0.5755102040816327,
"acc_norm_stderr": 0.031642094879429414
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.4022040509062026,
"mc2_stderr": 0.01499246594878979
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.012108365307437521
},
"harness|gsm8k|5": {
"acc": 0.20773313115996966,
"acc_stderr": 0.011174572716705902
}
}
```
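As an illustrative sketch (field names taken from the `all` block shown above), the headline metrics can be read programmatically once a `results_*.json` file has been downloaded:

```python
import json

# Minimal sketch: extract headline metrics from a results JSON string.
# The structure mirrors the "all" block in the results file above;
# the literal here is a small excerpt used purely for illustration.
results_json = '{"all": {"acc": 0.5454718360054382, "acc_norm": 0.5513853830318955}}'
results = json.loads(results_json)
headline = {metric: round(value, 4) for metric, value in results["all"].items()}
print(headline)  # {'acc': 0.5455, 'acc_norm': 0.5514}
```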
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
McSpicyWithMilo/target-locations-0.3split | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: target_location
dtype: string
- name: instruction_type
dtype: string
splits:
- name: train
num_bytes: 6754.825
num_examples: 70
- name: test
num_bytes: 2894.925
num_examples: 30
download_size: 11251
dataset_size: 9649.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "target-locations-0.3split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
version-control/the-stack-ds-lib-10k | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: hexsha
sequence: string
- name: file_path
sequence: string
- name: code
sequence: string
- name: apis
sequence:
sequence: string
splits:
- name: train
num_bytes: 124424438
num_examples: 9374
download_size: 44876714
dataset_size: 124424438
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Jessiecs/llama-2-7b-a3-openassistant-guanaco-backward | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12796158
num_examples: 9846
download_size: 7059359
dataset_size: 12796158
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xquad_r | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- ar
- de
- el
- en
- es
- hi
- ru
- th
- tr
- vi
- zh
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|squad
- extended|xquad
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: xquad-r
pretty_name: LAReQA
config_names:
- ar
- de
- el
- en
- es
- hi
- ru
- th
- tr
- vi
- zh
dataset_info:
- config_name: ar
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 1722775
num_examples: 1190
download_size: 263002
dataset_size: 1722775
- config_name: de
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 1283277
num_examples: 1190
download_size: 241957
dataset_size: 1283277
- config_name: el
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 2206666
num_examples: 1190
download_size: 324379
dataset_size: 2206666
- config_name: en
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 1116099
num_examples: 1190
download_size: 212372
dataset_size: 1116099
- config_name: es
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 1273475
num_examples: 1190
download_size: 236874
dataset_size: 1273475
- config_name: hi
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 2682951
num_examples: 1190
download_size: 322083
dataset_size: 2682951
- config_name: ru
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 2136966
num_examples: 1190
download_size: 321728
dataset_size: 2136966
- config_name: th
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 2854935
num_examples: 1190
download_size: 337307
dataset_size: 2854935
- config_name: tr
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 1210739
num_examples: 1190
download_size: 228364
dataset_size: 1210739
- config_name: vi
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 1477215
num_examples: 1190
download_size: 237644
dataset_size: 1477215
- config_name: zh
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 984217
num_examples: 1190
download_size: 205768
dataset_size: 984217
configs:
- config_name: ar
data_files:
- split: validation
path: ar/validation-*
- config_name: de
data_files:
- split: validation
path: de/validation-*
- config_name: el
data_files:
- split: validation
path: el/validation-*
- config_name: en
data_files:
- split: validation
path: en/validation-*
- config_name: es
data_files:
- split: validation
path: es/validation-*
- config_name: hi
data_files:
- split: validation
path: hi/validation-*
- config_name: ru
data_files:
- split: validation
path: ru/validation-*
- config_name: th
data_files:
- split: validation
path: th/validation-*
- config_name: tr
data_files:
- split: validation
path: tr/validation-*
- config_name: vi
data_files:
- split: validation
path: vi/validation-*
- config_name: zh
data_files:
- split: validation
path: zh/validation-*
---
# Dataset Card for XQuAD-R
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [LAReQA](https://github.com/google-research-datasets/lareqa)
- **Repository:** [XQuAD-R](https://github.com/google-research-datasets/lareqa)
- **Paper:** [LAReQA: Language-agnostic answer retrieval from a multilingual pool](https://arxiv.org/pdf/2004.05484.pdf)
- **Point of Contact:** [Noah Constant](mailto:nconstant@google.com)
### Dataset Summary
XQuAD-R is a retrieval version of the XQuAD dataset (a cross-lingual extractive QA dataset). Like XQuAD, XQuAD-R is an 11-way parallel dataset, where each question appears in 11 different languages and has 11 parallel correct answers across the languages.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset is available in the following languages:
* Arabic: `xquad-r/ar.json`
* German: `xquad-r/de.json`
* Greek: `xquad-r/el.json`
* English: `xquad-r/en.json`
* Spanish: `xquad-r/es.json`
* Hindi: `xquad-r/hi.json`
* Russian: `xquad-r/ru.json`
* Thai: `xquad-r/th.json`
* Turkish: `xquad-r/tr.json`
* Vietnamese: `xquad-r/vi.json`
* Chinese: `xquad-r/zh.json`
## Dataset Structure
[More Information Needed]
### Data Instances
An example from `en` config:
```
{'id': '56beb4343aeaaa14008c925b',
'context': "The Panthers defense gave up just 308 points, ranking sixth in the league, while also leading the NFL in interceptions with 24 and boasting four Pro Bowl selections. Pro Bowl defensive tackle Kawann Short led the team in sacks with 11, while also forcing three fumbles and recovering two. Fellow lineman Mario Addison added 6½ sacks. The Panthers line also featured veteran defensive end Jared Allen, a 5-time pro bowler who was the NFL's active career sack leader with 136, along with defensive end Kony Ealy, who had 5 sacks in just 9 starts. Behind them, two of the Panthers three starting linebackers were also selected to play in the Pro Bowl: Thomas Davis and Luke Kuechly. Davis compiled 5½ sacks, four forced fumbles, and four interceptions, while Kuechly led the team in tackles (118) forced two fumbles, and intercepted four passes of his own. Carolina's secondary featured Pro Bowl safety Kurt Coleman, who led the team with a career high seven interceptions, while also racking up 88 tackles and Pro Bowl cornerback Josh Norman, who developed into a shutdown corner during the season and had four interceptions, two of which were returned for touchdowns.",
'question': 'How many points did the Panthers defense surrender?',
'answers': {'text': ['308'], 'answer_start': [34]}}
```
### Data Fields
- `id` (`str`): Unique ID for the context-question pair.
- `context` (`str`): Context for the question.
- `question` (`str`): Question.
- `answers` (`dict`): Answers with the following keys:
- `text` (`list` of `str`): Texts of the answers.
- `answer_start` (`list` of `int`): Start positions for every answer text.
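As a quick sanity check on this schema (a sketch using the English example above, not part of any official loader), each `answer_start` is a character offset into `context`, so the answer text can be recovered by slicing:

```python
# Minimal sketch showing how `answers` relates to `context` in XQuAD-R.
# The record is the `en` example from this card, with the context truncated.
example = {
    "id": "56beb4343aeaaa14008c925b",
    "context": "The Panthers defense gave up just 308 points, ranking sixth in the league...",
    "question": "How many points did the Panthers defense surrender?",
    "answers": {"text": ["308"], "answer_start": [34]},
}

# Each answer_start is a character offset into the context string.
for text, start in zip(example["answers"]["text"],
                       example["answers"]["answer_start"]):
    recovered = example["context"][start:start + len(text)]
    assert recovered == text  # recovers "308"
```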
### Data Splits
The number of questions and candidate sentences for each language for XQuAD-R is shown in the table below:
| | XQuAD-R | |
|-----|-----------|------------|
| | questions | candidates |
| ar | 1190 | 1222 |
| de | 1190 | 1276 |
| el | 1190 | 1234 |
| en | 1190 | 1180 |
| es | 1190 | 1215 |
| hi | 1190 | 1244 |
| ru | 1190 | 1219 |
| th | 1190 | 852 |
| tr | 1190 | 1167 |
| vi | 1190 | 1209 |
| zh | 1190 | 1196 |
## Dataset Creation
[More Information Needed]
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
[More Information Needed]
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
[More Information Needed]
### Dataset Curators
The dataset was initially created by Uma Roy, Noah Constant, Rami Al-Rfou, Aditya Barua, Aaron Phillips and Yinfei Yang, during work done at Google Research.
### Licensing Information
XQuAD-R is distributed under the [CC BY-SA 4.0 license](https://creativecommons.org/licenses/by-sa/4.0/legalcode).
### Citation Information
```
@article{roy2020lareqa,
title={LAReQA: Language-agnostic answer retrieval from a multilingual pool},
author={Roy, Uma and Constant, Noah and Al-Rfou, Rami and Barua, Aditya and Phillips, Aaron and Yang, Yinfei},
journal={arXiv preprint arXiv:2004.05484},
year={2020}
}
```
### Contributions
Thanks to [@manandey](https://github.com/manandey) for adding this dataset. |
liuyanchen1015/MULTI_VALUE_stsb_possessives_belong | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 69095
num_examples: 332
- name: test
num_bytes: 45239
num_examples: 238
- name: train
num_bytes: 203264
num_examples: 1043
download_size: 207106
dataset_size: 317598
---
# Dataset Card for "MULTI_VALUE_stsb_possessives_belong"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_v1_recite_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 34574
num_examples: 300
- name: train_recite_qa
num_bytes: 222533
num_examples: 300
- name: eval_qa
num_bytes: 11254
num_examples: 100
- name: eval_recite_qa
num_bytes: 73368
num_examples: 100
- name: all_docs
num_bytes: 248990
num_examples: 392
- name: train
num_bytes: 471523
num_examples: 692
- name: validation
num_bytes: 73368
num_examples: 100
download_size: 0
dataset_size: 1135610
---
# Dataset Card for "lmind_nq_v1_recite_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/GPT4-10k-standardized | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 4380667
num_examples: 4052
download_size: 2227926
dataset_size: 4380667
---
# Dataset Card for "GPT4-10k-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lemhf14/EasyJailbreak_Datasets | ---
dataset_info:
- config_name: AdvBench
features:
- name: query
dtype: string
- name: reference_responses
sequence: string
splits:
- name: train
num_bytes: 86245
num_examples: 520
download_size: 35385
dataset_size: 86245
- config_name: ForbiddenQuestion
features:
- name: query
dtype: string
splits:
- name: train
num_bytes: 31734
num_examples: 390
download_size: 17493
dataset_size: 31734
- config_name: MJP
features:
- name: query
dtype: string
- name: reference_responses
sequence: string
splits:
- name: train
num_bytes: 86245
num_examples: 520
download_size: 35385
dataset_size: 86245
- config_name: MaliciousInstruct
features:
- name: query
dtype: string
splits:
- name: train
num_bytes: 6570
num_examples: 100
download_size: 4786
dataset_size: 6570
- config_name: QuestionList
features:
- name: query
dtype: string
splits:
- name: train
num_bytes: 6571
num_examples: 100
download_size: 5258
dataset_size: 6571
configs:
- config_name: AdvBench
data_files:
- split: train
path: AdvBench/train-*
- config_name: ForbiddenQuestion
data_files:
- split: train
path: ForbiddenQuestion/train-*
- config_name: MJP
data_files:
- split: train
path: MJP/train-*
- config_name: MaliciousInstruct
data_files:
- split: train
path: MaliciousInstruct/train-*
- config_name: QuestionList
data_files:
- split: train
path: QuestionList/train-*
---
|
liuyanchen1015/VALUE_wikitext2_lexical | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: test
num_bytes: 1220298
num_examples: 1796
- name: train
num_bytes: 10702271
num_examples: 15501
- name: validation
num_bytes: 1097319
num_examples: 1604
download_size: 7724483
dataset_size: 13019888
---
# Dataset Card for "VALUE_wikitext2_lexical"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/silence_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of silence/サイレンス/赫默 (Arknights)
This is the dataset of silence/サイレンス/赫默 (Arknights), containing 483 images and their tags.
The core tags of this character are `brown_hair, short_hair, glasses, feather_hair, owl_ears, semi-rimless_eyewear, brown_eyes, hair_between_eyes, under-rim_eyewear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 483 | 730.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silence_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 483 | 379.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silence_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1107 | 799.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silence_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 483 | 627.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silence_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1107 | 1.17 GiB | [Download](https://huggingface.co/datasets/CyberHarem/silence_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/silence_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, solo, black_dress, long_hair, looking_at_viewer, off_shoulder, yellow_eyes, ahoge, fingerless_gloves, id_card, cowboy_shot, earrings, parted_lips, rhine_lab_logo, single_glove, drone, holding_removed_eyewear, jacket, medium_breasts, sleeveless, unworn_eyewear |
| 1 | 12 |  |  |  |  |  | 1girl, armband, hand_up, long_sleeves, solo, upper_body, rhine_lab_logo, simple_background, looking_at_viewer, white_background, closed_mouth, adjusting_eyewear, black-framed_eyewear, id_card, jacket |
| 2 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, portrait, simple_background, solo, white_background, jacket, closed_mouth, orange_eyes |
| 3 | 15 |  |  |  |  |  | 1girl, black_thighhighs, long_sleeves, looking_at_viewer, rhine_lab_logo, simple_background, solo, thigh_strap, single_thighhigh, armband, white_background, cowboy_shot, jacket, black-framed_eyewear, id_card, orange_eyes, vial, closed_mouth |
| 4 | 9 |  |  |  |  |  | 1girl, armband, black_thighhighs, full_body, long_sleeves, rhine_lab_logo, simple_background, single_thighhigh, solo, asymmetrical_legwear, looking_at_viewer, single_sock, white_background, id_card, infection_monitor_(arknights), jacket, thigh_strap, standing, wide_sleeves, ahoge, antenna_hair, black_socks, holding, wings |
| 5 | 16 |  |  |  |  |  | long_sleeves, 1girl, id_card, looking_at_viewer, solo, black_shorts, green_gloves, closed_mouth, rhine_lab_logo, holding, yellow_coat, white_shirt, yellow_eyes, lanyard, open_coat, standing, full_body, mask_around_neck, yellow_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | black_dress | long_hair | looking_at_viewer | off_shoulder | yellow_eyes | ahoge | fingerless_gloves | id_card | cowboy_shot | earrings | parted_lips | rhine_lab_logo | single_glove | drone | holding_removed_eyewear | jacket | medium_breasts | sleeveless | unworn_eyewear | armband | hand_up | long_sleeves | upper_body | simple_background | white_background | closed_mouth | adjusting_eyewear | black-framed_eyewear | portrait | orange_eyes | black_thighhighs | thigh_strap | single_thighhigh | vial | full_body | asymmetrical_legwear | single_sock | infection_monitor_(arknights) | standing | wide_sleeves | antenna_hair | black_socks | holding | wings | black_shorts | green_gloves | yellow_coat | white_shirt | lanyard | open_coat | mask_around_neck | yellow_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:--------------|:------------|:--------------------|:---------------|:--------------|:--------|:--------------------|:----------|:--------------|:-----------|:--------------|:-----------------|:---------------|:--------|:--------------------------|:---------|:-----------------|:-------------|:-----------------|:----------|:----------|:---------------|:-------------|:--------------------|:-------------------|:---------------|:--------------------|:-----------------------|:-----------|:--------------|:-------------------|:--------------|:-------------------|:-------|:------------|:-----------------------|:--------------|:--------------------------------|:-----------|:---------------|:---------------|:--------------|:----------|:--------|:---------------|:---------------|:--------------|:--------------|:----------|:------------|:-------------------|:----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | X | | | X | | | | | X | | | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | | X | | | | | | | | | | | | | X | | | | | | | | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | | X | | | X | | | | | X | X | | | X | | | | X | | | | X | | X | | X | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | X | | | X | | | X | | X | | | | X | | | | X | | | | X | | X | | X | X | | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 5 | 16 |  |  |  |  |  | X | | X | | | X | | X | | | X | | | | X | | | | | | | | | | X | | | | X | | | | | | | | | X | | | | X | | | | X | | X | X | X | X | X | X | X | X |
|
Multimodal-Fatima/VQAv2_validation_facebook_opt_6.7b_mode_VQAv2_visclues_detection_ns_100_open_ended | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_bs_32
num_bytes: 14423
num_examples: 100
download_size: 8281
dataset_size: 14423
---
# Dataset Card for "VQAv2_validation_facebook_opt_6.7b_mode_VQAv2_visclues_detection_ns_100_open_ended"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/metatree_visualizing_soil | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 266816
num_examples: 6064
- name: validation
num_bytes: 113388
num_examples: 2577
download_size: 226448
dataset_size: 380204
---
# Dataset Card for "metatree_visualizing_soil"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_definite_for_indefinite_articles | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 58373
num_examples: 403
- name: test
num_bytes: 123346
num_examples: 858
- name: train
num_bytes: 1963616
num_examples: 18212
download_size: 1251253
dataset_size: 2145335
---
# Dataset Card for "MULTI_VALUE_sst2_definite_for_indefinite_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
haris001/RAG_DS | ---
license: apache-2.0
---
|
ovior/twitter_dataset_1713135100 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2641051
num_examples: 8193
download_size: 1484453
dataset_size: 2641051
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_57 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 42815664
num_examples: 4441
download_size: 12218172
dataset_size: 42815664
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_57"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
leebissessar5/fashion_image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22842342.0
num_examples: 100
download_size: 22823707
dataset_size: 22842342.0
---
# Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xinke-wang/ocr-masks | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1087540216.0
num_examples: 1138
download_size: 1087038711
dataset_size: 1087540216.0
---
# Dataset Card for "ocr-masks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VLyb/FB15k | ---
license: unlicense
language:
- en
tags:
- link-prediction
pretty_name: FB15k
size_categories:
- 10K<n<100K
---
# FB15k Dataset
Details can be found in the following paper:
+ [Translating Embeddings for Modeling Multi-relational Data](http://dl.acm.org/doi/10.5555/2999792.2999923) |
HathawayLiu/housing_dataset | ---
language:
- en
size_categories:
- 100K<n<1M
tags:
- housing
- permits
- Seattle
dataset_info:
features:
- name: PermitNum
dtype: string
- name: PermitClass
dtype: string
- name: PermitClassMapped
dtype: string
- name: PermitTypeMapped
dtype: string
- name: PermitTypeDesc
dtype: string
- name: Description
dtype: string
- name: HousingUnits
dtype: int64
- name: HousingUnitsRemoved
dtype: int64
- name: HousingUnitsAdded
dtype: int64
- name: EstProjectCost
dtype: float32
- name: AppliedDate
dtype: string
- name: IssuedDate
dtype: string
- name: ExpiresDate
dtype: string
- name: CompletedDate
dtype: string
- name: StatusCurrent
dtype: string
- name: RelatedMup
dtype: string
- name: OriginalAddress1
dtype: string
- name: OriginalCity
dtype: string
- name: OriginalState
dtype: string
- name: OriginalZip
dtype: int64
- name: ContractorCompanyName
dtype: string
- name: Link
dtype: string
- name: Latitude
dtype: float32
- name: Longitude
dtype: float32
- name: Location1
dtype: string
- name: NeighborDistrict
dtype: string
splits:
- name: train
num_bytes: 47214591
num_examples: 97541
- name: test
num_bytes: 11802066
num_examples: 24388
download_size: 18076020
dataset_size: 59016657
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for Housing_Dataset
This dataset contains all the building permits issued or in progress
within the city of Seattle from 2000 to the present, and it is still
being updated over time. Information includes permit record URLs,
detailed addresses, estimated building costs, and more, as presented in the `housing_dataset.py`
file and the description below.
## Dataset Details
### Dataset Description
This [**Seattle Housing permits dataset**](https://data.seattle.gov/Permitting/Building-Permits/76t5-zqzr/about_data)
is published by the Seattle government and can be found on the Seattle government open data portal.
The Building Permits dataset from the City of Seattle's Open Data portal provides comprehensive information about building permits issued or currently in progress within Seattle.
This dataset, which dates back to 1990 and continues to be updated, includes a wide range of details such as permit numbers, types, descriptions,
estimated project costs, and related contractor information; the CSV table on the official website contains 25 columns in total.
Moreover, Seattle is divided into 13 neighborhood districts. Based on the [Seattle Neighborhood District GeoJSON file](https://data-seattlecitygis.opendata.arcgis.com/datasets/SeattleCityGIS::neighborhood-map-atlas-districts/about) found on the Seattle government website,
a new column, `NeighborDistrict`, was created. Using the provided GeoJSON file, every permit is assigned to its corresponding
neighborhood district based on the `Latitude` and `Longitude` columns in the CSV for future use.
- **Curated by:** [Seattle Government Open data portal](https://data.seattle.gov/)
- **Language(s) (NLP):** [English]
- **License:** [Public Domain by Seattle Government](http://www.seattle.gov/sdci)
### Dataset Sources
- **Official Website:** [https://data.seattle.gov/]
- **Repository for Cleaned Dataset:** [https://github.com/HathawayLiu/Housing_dataset]
## Uses
The Building Permits dataset from the City of Seattle is intended for use in various urban development and research applications.
It can assist in understanding building trends in Seattle, aid city planning, and support academic research on urban development.
The dataset is also a valuable tool for residents and businesses to stay informed about construction activities and regulations in the city.
For residents in particular, this dataset provides a starting point for choosing future housing by looking at housing cost, neighborhood district,
and other information in the dataset.
Additionally, it supports transparency and public engagement in city planning processes.
### Direct Use
The Building Permits dataset from the City of Seattle is suitable for several use cases:
- **Urban Planning and Development:** Planners and developers can analyze trends in building permits to inform city development strategies and infrastructure planning.
- **Academic Research:** Researchers in urban studies, economics, and social sciences can use the data for studies on urban growth, housing, and economic activity.
- **Real Estate Analysis:** Real estate professionals can assess building activities in neighborhoods for market analysis and investment decisions.
- **Public Awareness:** The general public can use this data to stay informed about construction activities and developmental changes in their community.
- **Government and Policy Making:** Local government officials can utilize this data to make informed decisions on housing policies, zoning laws,
and community development projects.
- **Residents' housing choices:** Residents can consult this dataset for relevant information when choosing future housing.
### Out-of-Scope Use
The Building Permits dataset from the City of Seattle should not be used for purposes that could infringe on privacy or for activities that are not in line
with ethical standards. This includes any form of misuse or malicious use such as targeting individuals or businesses based on the information provided in the dataset.
Additionally, the dataset may not be suitable for applications requiring highly specialized or non-public information about building structures,
as it primarily contains permit-related data.
## Dataset Structure
The cleaned and modified full dataset [`Building_Permits_Cleaned.csv`], along with the split train [`housing_train_dataset.csv`] and test [`housing_test_dataset.csv`] datasets,
are provided in the following GitHub repo: [https://github.com/HathawayLiu/Housing_dataset]. The cleaned train and test datasets are also provided in the **`data`**
folder of this repo.
The cleaned dataset contains 26 columns in total:
- **`PermitNum`(string):** The tracking number used to refer to this permit in SDCI's tracking system.
- **`PermitClass`(string):** The permit class tells you the type of project.
- **`PermitClassMapped`(string):** A description of whether the permit is for a residential or non-residential project.
- **`PermitTypeMapped`(string):** The permit type by category, such as building, demolition, roofing, grading, and environmentally critical areas.
- **`PermitTypeDesc`(string):** Additional information about the type of permit. For example, whether it is an addition/alternation or a new project.
- **`Description`(string):** A brief description of the work that will be done under this permit. This description is subject to change before SDCI issues the permit. The description is generally more stable if we have issued the permit. Very long descriptions have been truncated.
- **`HousingUnits`(int):** The number of housing units included at the beginning of the project.
- **`HousingUnitsRemoved`(int):** The number of housing units removed during the project.
- **`HousingUnitsAdded`(int):** The number of housing units added during the project.
- **`EstProjectCost`(float):** The estimated project cost of the work being proposed is based on fair market value (parts plus labor). The estimated cost (if any) represents the best available information to date, and is subject to change if the project is modified. We do not collect the estimated project cost for all permit types.
- **`AppliedDate`(string):** The date SDCI accepted the application as a complete submittal.
- **`IssuedDate`(string):** The date SDCI issued the permit. If there is an Application Date but no Issue Date, this generally means the application is still under review.
- **`ExpiresDate`(string):** The date the application is due to expire. Generally, this is the date by which work is supposed to be completed (barring renewals or further extensions). If there is not an Expiration Date, this generally means the permit has not been issued.
- **`CompletedDate`(string):** The date the permit had all its inspections completed. If there is an Issue Date but not a Completed Date, this generally means the permit is still under inspection.
- **`RelatedMup`(string):** The land use permit that is related to this building permit, if there is one.
- **`OriginalAddress1`(string):** The street name and number of the project.
- **`OriginalCity`(string):** The city for the project's address.
- **`OriginalState`(string):** The state for the project's address.
- **`OriginalZip`(int):** The zip code for the project's address.
- **`ContractorCompanyName`(string):** The contractor(s) associated with this permit.
- **`Link`(string):** A link to view full details and current status information about this permit at SDCI's website.
- **`Latitude`(float):** Latitude of the worksite where permit activity occurs. May be missing for a small number of permits considered "unaddressable."
- **`Longitude`(float):** Longitude of the worksite where permit activity occurs. May be missing for a small number of permits considered "unaddressable."
- **`Location1`(string):** The latitude and longitude location for mapping purposes.
- (Newly added column) **`NeighborDistrict`(string):** The district that the housing belongs to according to its location.
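As an illustration of how the documented columns might be consumed (a sketch over made-up rows, not the real data), per-district counts and average costs can be tallied with the Python standard library:

```python
import csv
import io
from collections import Counter

# Hypothetical miniature of Building_Permits_Cleaned.csv, using the column
# names documented above; the rows and values are invented for illustration.
sample_csv = """PermitNum,PermitClassMapped,EstProjectCost,NeighborDistrict
6075593-CN,Residential,43014.0,Northwest
6075594-CN,Residential,120000.0,Northwest
6075595-CN,Non-Residential,250000.0,Downtown
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))

# Count permits per neighborhood district.
per_district = Counter(row["NeighborDistrict"] for row in rows)
print(per_district)  # Counter({'Northwest': 2, 'Downtown': 1})

# Average estimated project cost for residential permits.
res_costs = [float(r["EstProjectCost"]) for r in rows
             if r["PermitClassMapped"] == "Residential"]
print(sum(res_costs) / len(res_costs))  # 81507.0
```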
## Dataset Creation
### Curation Rationale
The Building Permits dataset from the City of Seattle was created to foster transparency, public awareness, and engagement in the city's urban development processes.
It provides residents, professionals, and researchers with detailed information about building activities, facilitating informed decision-making and community involvement in city planning and development.
Given the importance of Seattle's 13 neighborhood districts, the newly added neighborhood-district column gives residents and the government
a chance to investigate building activities and quality of life across different neighborhood districts.
The dataset supports the city's commitment to open data and the promotion of data-driven insights for improving urban infrastructure and living conditions.
#### Data Collection and Processing
The Building Permits dataset is collected by the Seattle government and contains all of the recent information about housing permits in Seattle. The dataset is published on the
Seattle Government Open Data Portal and is kept up to date over time. You can download the raw data from the [Seattle Government website](https://data.seattle.gov/Permitting/Building-Permits/76t5-zqzr/about_data)
in different formats. For my own purposes I downloaded the CSV version, current as of this repo's last modification, which you can find in the following GitHub repo: [https://github.com/HathawayLiu/Housing_dataset]
(file name: `Building_Permits_20240213.csv`). To process and clean the dataset, I did the following steps:
1. Pre-process the data to make sure the columns are of the correct types.
2. Use the provided `Latitude` and `Longitude` columns in the dataset, along with the Google Geocoding API, to fill in the blanks in the `OriginalZip` (zip code) column.
3. Use the provided `Latitude` and `Longitude` columns and the GeoJSON file of the Seattle neighborhood districts to assign building permits to their corresponding neighborhood districts.
4. (The GeoJSON file of the Seattle neighborhood districts can be found in this GitHub repo: [https://github.com/HathawayLiu/Housing_dataset]. You can also download it through the Seattle GeoData portal: https://data-seattlecitygis.opendata.arcgis.com/datasets/SeattleCityGIS::neighborhood-map-atlas-districts/about)
5. Fill in the remaining blanks in the dataset with `N/A` for easier future use.
6. Split the dataset into train and test sets for future use.
For more details about data cleaning and processing, refer to the `data_cleaning.py` file in this repo. Note that to use the zip-code lookup function,
you need your own API key. Applying for a Google Geocoding API key is free; you can follow this link to apply: https://developers.google.com/maps/documentation/geocoding/get-api-key
You are more than welcome to download the raw data and process the dataset yourself.
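Assigning a permit to a district (step 3 above) amounts to a point-in-polygon test between the permit's coordinates and each district polygon from the GeoJSON file. Below is a minimal, dependency-free sketch of that test; the square "district" is made up for illustration and is not real Seattle geometry:

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is (lon, lat) inside the polygon?

    `polygon` is a list of (lon, lat) vertices, as in a GeoJSON ring.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from the point cross this edge?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Made-up square "district" roughly covering northwest Seattle.
northwest = [(-122.42, 47.65), (-122.33, 47.65),
             (-122.33, 47.73), (-122.42, 47.73)]

# Coordinates of the example permit from this card (Latitude/Longitude columns).
print(point_in_polygon(-122.3644, 47.6930, northwest))  # True
```

In practice a geometry library such as `shapely` handles holes, multipolygons, and edge cases more robustly, but the assignment logic is the same.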
To load the dataset, you can use the following commands:
```python
!pip install datasets
from datasets import load_dataset
dataset = load_dataset("HathawayLiu/housing_dataset", trust_remote_code=True)
```
To generate an example from the train/test set, use:
```python
next(iter(dataset['train']))
## next(iter(dataset['test']))
```
You can see the example from dataset like the following:
```
{'PermitNum': '6075593-CN',
'PermitClass': 'Single Family/Duplex',
'PermitClassMapped': 'Residential',
'PermitTypeMapped': 'Building',
'PermitTypeDesc': 'Addition/Alteration',
'Description': 'Replace existing windows; Upgrade new windows and framing for existing single family residence subject to field inspection',
'HousingUnits': 0,
'HousingUnitsRemoved': 0,
'HousingUnitsAdded': 0,
'EstProjectCost': 43014.0,
'AppliedDate': '10/12/05',
'IssuedDate': '10/12/05',
'ExpiresDate': '4/12/07',
'CompletedDate': '2/1/06',
'StatusCurrent': 'Completed',
'RelatedMup': 'nan',
'OriginalAddress1': '624 NW 88TH ST',
'OriginalCity': 'SEATTLE',
'OriginalState': 'WA',
'OriginalZip': 98117,
'ContractorCompanyName': 'STATEWIDE INC',
'Link': 'https://cosaccela.seattle.gov/portal/customize/LinkToRecord.aspx?altId=6075593-CN',
'Latitude': 47.692996978759766,
'Longitude': -122.36441040039062,
'Location1': '47.69299754, -122.3644121',
'NeighborDistrict': 'Northwest'}
```
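Records of this shape can be aggregated directly once loaded; for instance, a quick count of permits per neighborhood district. The sketch below runs on a couple of hand-written records (values invented); with the real dataset you would iterate over `dataset['train']` instead:

```python
from collections import Counter

# Toy records shaped like the example above; values are invented.
records = [
    {"PermitNum": "6075593-CN", "NeighborDistrict": "Northwest"},
    {"PermitNum": "6075594-CN", "NeighborDistrict": "Northwest"},
    {"PermitNum": "6075595-CN", "NeighborDistrict": "Downtown"},
]

# Tally permits per district.
counts = Counter(r["NeighborDistrict"] for r in records)
```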
#### Who are the source data producers?
The Building Permits dataset was originally created and is maintained by the City of Seattle, specifically its Department of Construction and Inspections.
This department is responsible for overseeing building and land use in Seattle, ensuring safety and compliance with city codes.
The dataset reflects the department's ongoing work in managing and documenting building permits issued in the city.
For detailed information, visit the [Seattle Department of Construction & Inspections](https://www.seattle.gov/sdci).
## Bias, Risks, and Limitations
The Building Permits dataset from the City of Seattle has both technical and sociotechnical limitations:
1. **Technical Limitations**:
   - **Data Completeness**: Not all building permits may be captured, especially older records. Specific columns such as `IssuedDate`, `CompletedDate`, `AppliedDate`,
   and `RelatedMup` contain many missing values.
- **Data Accuracy**: There may be errors or inconsistencies in the data, especially in historical records.
- **Timeliness**: The dataset might not be updated in real-time, causing delays in reflecting the most current information.
2. **Sociotechnical Limitations**:
- **Privacy Concerns**: Detailed permit data could potentially be used to infer private information about property owners or residents.
- **Bias in Planning Decisions**: The data might be used to reinforce existing biases in urban planning, affecting marginalized communities.
- **Dependence on Technical Proficiency**: The dataset's utility is limited by the user's ability to interpret and analyze the data effectively.
3. **Bias**: The dataset reflects only permitted construction, not all building activities. This can bias analyses towards formal, recorded developments, overlooking informal or unpermitted construction.
4. **Risk**: Misuse can occur if data is used to unfairly target specific neighborhoods or communities for enforcement or political reasons.
These limitations should be considered when using this dataset for research, policy-making, or urban planning.
### Recommendations
To address the biases and limitations above, users should adopt the following recommendations:
- **Cross-Verification**: Use supplementary data sources for a more comprehensive view.
- **Privacy and Ethical Use**: Handle data responsibly, respecting privacy and avoiding discriminatory practices.
- **Data Cleaning and Validation**: Regularly update and clean the dataset to maintain accuracy and reliability.
|
sshreyy/CAI_DC_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 57851094
num_examples: 25085
- name: test
num_bytes: 9688350
num_examples: 4203
download_size: 25349627
dataset_size: 67539444
---
# Dataset Card for "CAI_DC_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
frankwilsonv3/test-court-data | ---
license: apache-2.0
---
|
FaalSa/f2 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 79710
num_examples: 1
- name: validation
num_bytes: 80190
num_examples: 1
- name: test
num_bytes: 80670
num_examples: 1
download_size: 61692
dataset_size: 240570
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
mahdibaghbanzadeh/GUE_tf_0 | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: labels
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 3658714
num_examples: 32378
- name: val
num_bytes: 113000
num_examples: 1000
- name: test
num_bytes: 113000
num_examples: 1000
download_size: 1766031
dataset_size: 3884714
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
tonarie/Wayback-Data-Youtube-Homepage-Videos | ---
license: mit
---
A dataset capturing information for each video feature instance on the YouTube homepage, 2005-2010, based on scraped Wayback Machine snapshots. |
open-llm-leaderboard/details_damerajee__Gaja-vv1 | ---
pretty_name: Evaluation run of damerajee/Gaja-vv1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [damerajee/Gaja-vv1](https://huggingface.co/damerajee/Gaja-vv1) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_damerajee__Gaja-vv1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T20:00:37.330045](https://huggingface.co/datasets/open-llm-leaderboard/details_damerajee__Gaja-vv1/blob/main/results_2024-02-29T20-00-37.330045.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.40202187828772123,\n\
\ \"acc_stderr\": 0.034099790578226846,\n \"acc_norm\": 0.4073931782955521,\n\
\ \"acc_norm_stderr\": 0.03498851926157876,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326926,\n \"mc2\": 0.42317616925273904,\n\
\ \"mc2_stderr\": 0.014617998089721076\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866978,\n\
\ \"acc_norm\": 0.515358361774744,\n \"acc_norm_stderr\": 0.01460449612939491\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5608444532961562,\n\
\ \"acc_stderr\": 0.004952698802275648,\n \"acc_norm\": 0.7549292969527982,\n\
\ \"acc_norm_stderr\": 0.004292500501716205\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3660377358490566,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.3660377358490566,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.18421052631578946,\n\
\ \"acc_stderr\": 0.03646758875075566,\n \"acc_norm\": 0.18421052631578946,\n\
\ \"acc_norm_stderr\": 0.03646758875075566\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n\
\ \"acc_stderr\": 0.02815603653823321,\n \"acc_norm\": 0.4290322580645161,\n\
\ \"acc_norm_stderr\": 0.02815603653823321\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4292929292929293,\n \"acc_stderr\": 0.035265527246011986,\n \"\
acc_norm\": 0.4292929292929293,\n \"acc_norm_stderr\": 0.035265527246011986\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5284974093264249,\n \"acc_stderr\": 0.03602573571288441,\n\
\ \"acc_norm\": 0.5284974093264249,\n \"acc_norm_stderr\": 0.03602573571288441\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.382051282051282,\n \"acc_stderr\": 0.02463554916390823,\n \
\ \"acc_norm\": 0.382051282051282,\n \"acc_norm_stderr\": 0.02463554916390823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871934,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871934\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150016,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150016\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43486238532110094,\n \"acc_stderr\": 0.02125463146560927,\n \"\
acc_norm\": 0.43486238532110094,\n \"acc_norm_stderr\": 0.02125463146560927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012397,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012397\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4852941176470588,\n \"acc_stderr\": 0.03507793834791324,\n \"\
acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03507793834791324\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5358649789029536,\n \"acc_stderr\": 0.03246338898055659,\n \
\ \"acc_norm\": 0.5358649789029536,\n \"acc_norm_stderr\": 0.03246338898055659\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.03355746535223264,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.03355746535223264\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3496932515337423,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.3496932515337423,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5512820512820513,\n\
\ \"acc_stderr\": 0.032583346493868806,\n \"acc_norm\": 0.5512820512820513,\n\
\ \"acc_norm_stderr\": 0.032583346493868806\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5223499361430396,\n\
\ \"acc_stderr\": 0.01786209177850786,\n \"acc_norm\": 0.5223499361430396,\n\
\ \"acc_norm_stderr\": 0.01786209177850786\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4479768786127168,\n \"acc_stderr\": 0.026772990653361823,\n\
\ \"acc_norm\": 0.4479768786127168,\n \"acc_norm_stderr\": 0.026772990653361823\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\
\ \"acc_stderr\": 0.014655780837497736,\n \"acc_norm\": 0.25921787709497207,\n\
\ \"acc_norm_stderr\": 0.014655780837497736\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.02824513402438729,\n\
\ \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.02824513402438729\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4790996784565916,\n\
\ \"acc_stderr\": 0.028373270961069414,\n \"acc_norm\": 0.4790996784565916,\n\
\ \"acc_norm_stderr\": 0.028373270961069414\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.43209876543209874,\n \"acc_stderr\": 0.02756301097160667,\n\
\ \"acc_norm\": 0.43209876543209874,\n \"acc_norm_stderr\": 0.02756301097160667\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028121636040639886,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028121636040639886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32529335071707954,\n\
\ \"acc_stderr\": 0.011965311536571528,\n \"acc_norm\": 0.32529335071707954,\n\
\ \"acc_norm_stderr\": 0.011965311536571528\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39705882352941174,\n \"acc_stderr\": 0.019794488900024113,\n \
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.019794488900024113\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.45454545454545453,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4489795918367347,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.4489795918367347,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5024875621890548,\n\
\ \"acc_stderr\": 0.03535490150137288,\n \"acc_norm\": 0.5024875621890548,\n\
\ \"acc_norm_stderr\": 0.03535490150137288\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.03786720706234214,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.03786720706234214\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326926,\n \"mc2\": 0.42317616925273904,\n\
\ \"mc2_stderr\": 0.014617998089721076\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7198105761641673,\n \"acc_stderr\": 0.012621707979798499\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \
\ \"acc_stderr\": 0.0025049422268605083\n }\n}\n```"
repo_url: https://huggingface.co/damerajee/Gaja-vv1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|arc:challenge|25_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|gsm8k|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hellaswag|10_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-00-37.330045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T20-00-37.330045.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- '**/details_harness|winogrande|5_2024-02-29T20-00-37.330045.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T20-00-37.330045.parquet'
- config_name: results
data_files:
- split: 2024_02_29T20_00_37.330045
path:
- results_2024-02-29T20-00-37.330045.parquet
- split: latest
path:
- results_2024-02-29T20-00-37.330045.parquet
---
# Dataset Card for Evaluation run of damerajee/Gaja-vv1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [damerajee/Gaja-vv1](https://huggingface.co/damerajee/Gaja-vv1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_damerajee__Gaja-vv1",
"harness_winogrande_5",
split="train")
```
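As the config list above suggests, each timestamped split name is derived from the run timestamp by replacing `-` and `:` with `_`. A minimal sketch of that convention, using a hypothetical helper name:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (as it appears in results filenames,
    e.g. "2024-02-29T20:00:37.330045") to the split naming
    convention used by the configs in this dataset."""
    return timestamp.replace("-", "_").replace(":", "_")

# The timestamped split for the run shown in this card:
print(run_timestamp_to_split("2024-02-29T20:00:37.330045"))
# 2024_02_29T20_00_37.330045
```

Passing the resulting string as `split=` to `load_dataset` selects that specific run instead of the `latest` alias.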
## Latest results
These are the [latest results from run 2024-02-29T20:00:37.330045](https://huggingface.co/datasets/open-llm-leaderboard/details_damerajee__Gaja-vv1/blob/main/results_2024-02-29T20-00-37.330045.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.40202187828772123,
"acc_stderr": 0.034099790578226846,
"acc_norm": 0.4073931782955521,
"acc_norm_stderr": 0.03498851926157876,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326926,
"mc2": 0.42317616925273904,
"mc2_stderr": 0.014617998089721076
},
"harness|arc:challenge|25": {
"acc": 0.4684300341296928,
"acc_stderr": 0.014582236460866978,
"acc_norm": 0.515358361774744,
"acc_norm_stderr": 0.01460449612939491
},
"harness|hellaswag|10": {
"acc": 0.5608444532961562,
"acc_stderr": 0.004952698802275648,
"acc_norm": 0.7549292969527982,
"acc_norm_stderr": 0.004292500501716205
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3660377358490566,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.3660377358490566,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.03646758875075566,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.03646758875075566
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4290322580645161,
"acc_stderr": 0.02815603653823321,
"acc_norm": 0.4290322580645161,
"acc_norm_stderr": 0.02815603653823321
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.035265527246011986,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.035265527246011986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5284974093264249,
"acc_stderr": 0.03602573571288441,
"acc_norm": 0.5284974093264249,
"acc_norm_stderr": 0.03602573571288441
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.382051282051282,
"acc_stderr": 0.02463554916390823,
"acc_norm": 0.382051282051282,
"acc_norm_stderr": 0.02463554916390823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871934,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871934
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43486238532110094,
"acc_stderr": 0.02125463146560927,
"acc_norm": 0.43486238532110094,
"acc_norm_stderr": 0.02125463146560927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012397,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012397
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5358649789029536,
"acc_stderr": 0.03246338898055659,
"acc_norm": 0.5358649789029536,
"acc_norm_stderr": 0.03246338898055659
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.03355746535223264,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.03355746535223264
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3496932515337423,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.3496932515337423,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.032583346493868806,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.032583346493868806
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5223499361430396,
"acc_stderr": 0.01786209177850786,
"acc_norm": 0.5223499361430396,
"acc_norm_stderr": 0.01786209177850786
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4479768786127168,
"acc_stderr": 0.026772990653361823,
"acc_norm": 0.4479768786127168,
"acc_norm_stderr": 0.026772990653361823
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497736,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497736
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.02824513402438729,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.02824513402438729
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4790996784565916,
"acc_stderr": 0.028373270961069414,
"acc_norm": 0.4790996784565916,
"acc_norm_stderr": 0.028373270961069414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.43209876543209874,
"acc_stderr": 0.02756301097160667,
"acc_norm": 0.43209876543209874,
"acc_norm_stderr": 0.02756301097160667
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028121636040639886,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028121636040639886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32529335071707954,
"acc_stderr": 0.011965311536571528,
"acc_norm": 0.32529335071707954,
"acc_norm_stderr": 0.011965311536571528
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.019794488900024113,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.019794488900024113
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4489795918367347,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.4489795918367347,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5024875621890548,
"acc_stderr": 0.03535490150137288,
"acc_norm": 0.5024875621890548,
"acc_norm_stderr": 0.03535490150137288
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.03786720706234214,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.03786720706234214
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326926,
"mc2": 0.42317616925273904,
"mc2_stderr": 0.014617998089721076
},
"harness|winogrande|5": {
"acc": 0.7198105761641673,
"acc_stderr": 0.012621707979798499
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.0025049422268605083
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Orange/webnlg-qa | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: category
dtype: string
- name: size
dtype: int32
- name: id
dtype: string
- name: eid
dtype: string
- name: original_triple_sets
list:
- name: subject
dtype: string
- name: property
dtype: string
- name: object
dtype: string
- name: modified_triple_sets
list:
- name: subject
dtype: string
- name: property
dtype: string
- name: object
dtype: string
- name: shape
dtype: string
- name: shape_type
dtype: string
- name: lex
sequence:
- name: comment
dtype: string
- name: lid
dtype: string
- name: text
dtype: string
- name: lang
dtype: string
- name: test_category
dtype: string
- name: dbpedia_links
sequence: string
- name: links
sequence: string
- name: graph
list:
list: string
- name: main_entity
dtype: string
- name: mappings
list:
- name: modified
dtype: string
- name: readable
dtype: string
- name: graph
dtype: string
- name: dialogue
list:
- name: question
list:
- name: source
dtype: string
- name: text
dtype: string
- name: graph_query
dtype: string
- name: readable_query
dtype: string
- name: graph_answer
list: string
- name: readable_answer
list: string
- name: type
list: string
splits:
- name: train
num_bytes: 33200723
num_examples: 10016
- name: validation
num_bytes: 4196972
num_examples: 1264
- name: test
num_bytes: 4990595
num_examples: 1417
- name: challenge
num_bytes: 420551
num_examples: 100
download_size: 9637685
dataset_size: 42808841
task_categories:
- conversational
- question-answering
- text-generation
tags:
- qa
- knowledge-graph
- sparql
language:
- en
---
# Dataset Card for WEBNLG-QA
## Dataset Description
- **Paper:** [SPARQL-to-Text Question Generation for Knowledge-Based Conversational Applications (AACL-IJCNLP 2022)](https://aclanthology.org/2022.aacl-main.11/)
- **Point of Contact:** Gwénolé Lecorvé
### Dataset Summary
WEBNLG-QA is a conversational question answering dataset grounded on WEBNLG. It consists of a set of question-answering dialogues (follow-up question-answer pairs) based on short paragraphs of text. Each paragraph is associated with a knowledge graph (from WEBNLG). The questions are associated with SPARQL queries.
### Supported tasks
* Knowledge-based question-answering
* SPARQL-to-Text conversion
#### Knowledge based question-answering
Below is an example of dialogue:
- Q1: What is used as an instrument is Sludge Metal or in Post-metal?
- A1: Singing, Synthesizer
- Q2: And what about Sludge Metal in particular?
- A2: Singing
- Q3: Does the Year of No Light album Nord belong to this genre?
- A3: Yes.
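In the released data, each such dialogue is stored as an ordered list of turns following the `dialogue` schema above. A minimal sketch of one turn (field names from the schema; all values are invented for illustration, not actual dataset content):

```python
# Hypothetical sketch of one dialogue turn, mirroring the `dialogue`
# schema above. All values are invented placeholders.
turn = {
    "question": [
        {"source": "template", "text": "What is used as an instrument in Sludge Metal?"},
    ],
    "graph_query": "SELECT ?x WHERE { resource:Sludge_metal property:instrument ?x }",
    "readable_query": "SELECT ?x WHERE { Sludge_metal | instrument | ?x }",
    "graph_answer": ["resource:Singing"],
    "readable_answer": ["Singing"],
    "type": ["open"],
}

# A dialogue is simply an ordered list of such turns.
dialogue = [turn]
assert dialogue[0]["readable_answer"] == ["Singing"]
```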
#### SPARQL-to-Text Question Generation
SPARQL-to-Text question generation refers to the task of converting a SPARQL query into a natural language question, e.g.:
```SQL
SELECT (COUNT(?country) as ?answer)
WHERE { ?country property:member_of resource:Europe .
?country property:population ?n .
FILTER ( ?n > 10000000 )
}
```
could be converted into:
```txt
How many European countries have more than 10 million inhabitants?
```
## Dataset Structure
### Types of questions
Comparison of question types with related datasets:
| | | [SimpleQuestions](https://huggingface.co/datasets/OrangeInnov/simplequestions-sparqltotext) | [ParaQA](https://huggingface.co/datasets/OrangeInnov/paraqa-sparqltotext) | [LC-QuAD 2.0](https://huggingface.co/datasets/OrangeInnov/lcquad_2.0-sparqltotext) | [CSQA](https://huggingface.co/datasets/OrangeInnov/csqa-sparqltotext) | [WebNLG-QA](https://huggingface.co/datasets/OrangeInnov/webnlg-qa) |
|--------------------------|-----------------|:---------------:|:------:|:-----------:|:----:|:---------:|
| **Number of triplets in query** | 1 | ✓ | ✓ | ✓ | ✓ | ✓ |
| | 2 | | ✓ | ✓ | ✓ | ✓ |
| | More | | | ✓ | ✓ | ✓ |
| **Logical connector between triplets** | Conjunction | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Disjunction | | | | ✓ | ✓ |
| | Exclusion | | | | ✓ | ✓ |
| **Topology of the query graph** | Direct | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Sibling | | ✓ | ✓ | ✓ | ✓ |
| | Chain | | ✓ | ✓ | ✓ | ✓ |
| | Mixed | | | ✓ | | ✓ |
| | Other | | ✓ | ✓ | ✓ | ✓ |
| **Variable typing in the query** | None | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Target variable | | ✓ | ✓ | ✓ | ✓ |
| | Internal variable | | ✓ | ✓ | ✓ | ✓ |
| **Comparisons clauses** | None | ✓ | ✓ | ✓ | ✓ | ✓ |
| | String | | | ✓ | | ✓ |
| | Number | | | ✓ | ✓ | ✓ |
| | Date | | | ✓ | | ✓ |
| **Superlative clauses** | No | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Yes | | | | ✓ | |
| **Answer type** | Entity (open) | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Entity (closed) | | | | ✓ | ✓ |
| | Number | | | ✓ | ✓ | ✓ |
| | Boolean | | ✓ | ✓ | ✓ | ✓ |
| **Answer cardinality** | 0 (unanswerable) | | | ✓ | | ✓ |
| | 1 | ✓ | ✓ | ✓ | ✓ | ✓ |
| | More | | ✓ | ✓ | ✓ | ✓ |
| **Number of target variables** | 0 (⇒ ASK verb) | | ✓ | ✓ | ✓ | ✓ |
| | 1 | ✓ | ✓ | ✓ | ✓ | ✓ |
| | 2 | | | ✓ | | ✓ |
| **Dialogue context** | Self-sufficient | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Coreference | | | | ✓ | ✓ |
| | Ellipsis | | | | ✓ | ✓ |
| **Meaning** | Meaningful | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Non-sense | | | | | ✓ |
### Data splits
Text verbalization is only available for a subset of the test set, referred to as the *challenge set*. Other samples only contain dialogues in the form of follow-up SPARQL queries.
| | Train | Validation | Test | Challenge |
| --------------------- | ---------- | ---------- | ---------- | ------------ |
| Questions | 27727 | 3485 | 4179 | 332 |
| Dialogues | 1001 | 1264 | 1417 | 100 |
| NL question per query | 0 | 0 | 0 | 2 |
| Characters per query | 129 (± 43) | 131 (± 45) | 122 (± 45) | 113 (± 38) |
| Tokens per question | - | - | - | 8.4 (± 4.5) |
## Additional information
### Related datasets
This corpus is part of a set of 5 datasets released for SPARQL-to-Text generation, namely:
- Non-conversational datasets
- [SimpleQuestions](https://huggingface.co/datasets/OrangeInnov/simplequestions-sparqltotext) (from https://github.com/askplatypus/wikidata-simplequestions)
- [ParaQA](https://huggingface.co/datasets/OrangeInnov/paraqa-sparqltotext) (from https://github.com/barshana-banerjee/ParaQA)
- [LC-QuAD 2.0](https://huggingface.co/datasets/OrangeInnov/lcquad_2.0-sparqltotext) (from http://lc-quad.sda.tech/)
- Conversational datasets
- [CSQA](https://huggingface.co/datasets/OrangeInnov/csqa-sparqltotext) (from https://amritasaha1812.github.io/CSQA/)
  - [WebNLG-QA](https://huggingface.co/datasets/OrangeInnov/webnlg-qa) (derived from https://gitlab.com/shimorina/webnlg-dataset/-/tree/master/release_v3.0)
### Licensing information
* Content from the original dataset: CC BY-SA 4.0
* New content: CC BY-SA 4.0
### Citation information
#### This dataset
```bibtex
@inproceedings{lecorve2022sparql2text,
title={SPARQL-to-Text Question Generation for Knowledge-Based Conversational Applications},
author={Lecorv\'e, Gw\'enol\'e and Veyret, Morgan and Brabant, Quentin and Rojas-Barahona, Lina M.},
  booktitle={Proceedings of the Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the International Joint Conference on Natural Language Processing (AACL-IJCNLP)},
year={2022}
}
```
#### The underlying corpus WEBNLG 3.0
```bibtex
@inproceedings{castro-ferreira-etal-2020-2020,
title = "The 2020 Bilingual, Bi-Directional {W}eb{NLG}+ Shared Task: Overview and Evaluation Results ({W}eb{NLG}+ 2020)",
author = "Castro Ferreira, Thiago and Gardent, Claire and Ilinykh, Nikolai and van der Lee, Chris and Mille, Simon and Moussallem, Diego and Shimorina, Anastasia",
booktitle = "Proceedings of the 3rd International Workshop on Natural Language Generation from the Semantic Web (WebNLG+)",
year = "2020",
pages = "55--76"
}
```
|
alirzb/SeizureClassifier_Wav2Vec_43243531_on_Bal_43827959 | ---
dataset_info:
features:
- name: array
sequence: float64
- name: label_true
dtype: int64
- name: label_pred
dtype: int64
splits:
- name: train
num_bytes: 164995272
num_examples: 402
download_size: 89805200
dataset_size: 164995272
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hyperdemocracy/usc-nomic-chunks-v1-s8192-o512 | ---
configs:
- config_name: default
data_files:
- path: data/usc-113-nomic-chunks-v1-s8192-o512.parquet
split: '113'
- path: data/usc-114-nomic-chunks-v1-s8192-o512.parquet
split: '114'
- path: data/usc-115-nomic-chunks-v1-s8192-o512.parquet
split: '115'
- path: data/usc-116-nomic-chunks-v1-s8192-o512.parquet
split: '116'
- path: data/usc-117-nomic-chunks-v1-s8192-o512.parquet
split: '117'
- path: data/usc-118-nomic-chunks-v1-s8192-o512.parquet
split: '118'
dataset_info:
features:
- dtype: string
name: chunk_id
- dtype: string
name: congress_num
- dtype: string
name: nomic_topic_depth_1
- dtype: string
name: nomic_topic_depth_2
- dtype: string
name: nomic_topic_depth_3
- dtype: float32
name: nomic_proj_x
- dtype: float32
name: nomic_proj_y
- list:
dtype: float32
name: nomic_vec
- dtype: string
name: text
- name: chunk_metadata
struct:
- dtype: string
name: chunk_id
- dtype: int32
name: chunk_index
- dtype: string
name: congress_num
- dtype: string
name: legis_class
- dtype: string
name: legis_id
- dtype: int32
name: legis_num
- dtype: string
name: legis_type
- dtype: string
name: legis_version
- dtype: int32
name: start_index
- dtype: string
name: text_date
- dtype: string
name: text_id
- name: bill_metadata
struct:
- dtype: string
name: introduced_date
- dtype: string
name: origin_chamber
- dtype: string
name: policy_area
- list:
dtype: string
name: subjects
- list:
- dtype: string
name: bioguide_id
- dtype: string
name: district
- dtype: string
name: first_name
- dtype: string
name: full_name
- dtype: string
name: is_by_request
- dtype: string
name: last_name
- dtype: string
name: middle_name
- dtype: string
name: party
- dtype: string
name: state
- name: identifiers
struct:
- dtype: string
name: bioguide_id
- dtype: string
name: lis_id
- dtype: string
name: gpo_id
name: sponsors
--- |
Imran1/newdatasetbalance | ---
license: mit
---
|
justinlamlamlam/wiki_context_open_orca_v2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1957823
num_examples: 424
download_size: 1146610
dataset_size: 1957823
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yangp/chat-gvg-Scrolls | ---
license: apache-2.0
dataset_info:
features:
- name: context
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 57903108
num_examples: 8156
download_size: 12613802
dataset_size: 57903108
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaist-ai/selfee-train | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: instruction
dtype: string
- name: outputs
list:
- name: feedback
dtype: string
- name: output
dtype: string
- name: dataset
dtype: string
- name: output
dtype: string
- name: iteration_truncated
dtype: bool
- name: iteration
dtype: int64
- name: input
dtype: string
splits:
- name: train
num_bytes: 511377846
num_examples: 178331
download_size: 230123988
dataset_size: 511377846
---
|
RoryLiu19/apitest | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 128180
num_examples: 10
download_size: 36266
dataset_size: 128180
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jkeisling/project-gutenberg-top-books-oct-2023 | ---
license: other
license_name: project-gutenberg-license
license_link: https://gutenberg.org/policy/license.html
---
# Project Gutenberg top 1000 titles, Sept-Oct 2023
<!-- Provide a quick summary of the dataset. -->
This is the data (title, author, monthly downloads) and [ember-v1](https://huggingface.co/llmrails/ember-v1) embeddings of the top 1000 most downloaded books on [Project Gutenberg](https://www.gutenberg.org).
All data is directly taken from Project Gutenberg's [Top 1000 page](https://www.gutenberg.org/browse/scores/top1000.php).
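Since the card ships precomputed ember-v1 embeddings alongside each title, nearest-neighbour lookups over the books reduce to a vector similarity. A minimal cosine-similarity sketch in pure Python (the toy vectors below are invented stand-ins for the high-dimensional ember-v1 embeddings, not real values from this dataset):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional stand-ins for the real ember-v1 embeddings.
frankenstein = [0.1, 0.9, 0.2, 0.4]
dracula = [0.2, 0.8, 0.1, 0.5]
cookbook = [0.9, 0.1, 0.7, 0.0]

# Thematically close titles should score higher than unrelated ones.
assert cosine_similarity(frankenstein, dracula) > cosine_similarity(frankenstein, cookbook)
```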
I am not affiliated with Project Gutenberg: I've just ported this here for convenience. |
louisbrulenaudet/code-assurances | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code des assurances
source_datasets:
- original
pretty_name: Code des assurances
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code des assurances, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
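For illustration, a single record following this schema might look like the sketch below. The instruction is taken from the generation list shown further down; the remaining field values are invented placeholders, not actual dataset content:

```python
# Hypothetical record following the schema described above.
# Apart from the instruction, all values are invented placeholders.
record = {
    "instruction": "Compose l'intégralité de l'article sous forme écrite.",
    "input": "Code des assurances, art. L100-1",
    "output": "Texte intégral de l'article (placeholder).",
    "start": "2024-04-15",
    "expiration": "",
    "num": "L100-1",
}

# All six fields are strings, matching the schema description.
assert all(isinstance(v, str) for v in record.values())
```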
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
CyberHarem/space_ishtar_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of space_ishtar/スペース・イシュタル/太空伊什塔尔 (Fate/Grand Order)
This is the dataset of space_ishtar/スペース・イシュタル/太空伊什塔尔 (Fate/Grand Order), containing 353 images and their tags.
The core tags of this character are `long_hair, two_side_up, multicolored_hair, two-tone_hair, parted_bangs, black_hair, red_hair, breasts, very_long_hair, horns, ribbon, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 353 | 532.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/space_ishtar_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 353 | 463.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/space_ishtar_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 860 | 881.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/space_ishtar_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/space_ishtar_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blue_hair, crescent_facial_mark, forehead_mark, fur_trim, looking_at_viewer, pink_hair, red_bodysuit, solo, star-shaped_pupils, yellow_eyes, medium_breasts, black_gloves, open_mouth, ass, smile |
| 1 | 23 |  |  |  |  |  | 1girl, blue_hair, crescent_facial_mark, forehead_mark, looking_at_viewer, solo, yellow_eyes, fur-trimmed_cloak, pink_hair, red_bodysuit, star-shaped_pupils, red_cloak, gloves |
| 2 | 9 |  |  |  |  |  | 1girl, blue_hair, bodysuit, crescent_facial_mark, forehead_mark, fur-trimmed_cloak, solo, blue_eyes, star-shaped_pupils, looking_at_viewer, black_gloves, orb |
| 3 | 5 |  |  |  |  |  | 1girl, black_bodysuit, grey_eyes, hair_bow, katana, looking_at_viewer, small_breasts, solo, belt, cleavage_cutout, closed_mouth, sheathed, gloves, smile |
| 4 | 7 |  |  |  |  |  | 1girl, black_bodysuit, katana, solo, belt, looking_at_viewer, sheathed, small_breasts, space, cleavage_cutout, grey_eyes, medium_breasts, earth_(planet) |
| 5 | 7 |  |  |  |  |  | 1girl, black_bodysuit, holding_sword, katana, looking_at_viewer, solo, small_breasts, grey_eyes, belt, cleavage_cutout, sheath, parted_lips |
| 6 | 10 |  |  |  |  |  | 1girl, belt, fingerless_gloves, navel, solo, yellow_gloves, yellow_shorts, yellow_vest, midriff, short_shorts, looking_at_viewer, smile, gun, hoop_earrings, sheath, hair_ribbon, katana, yellow_footwear, bandaid, holding, knee_boots, full_body |
| 7 | 8 |  |  |  |  |  | 1girl, bandaid, fingerless_gloves, navel, open_vest, solo, yellow_gloves, yellow_shorts, yellow_vest, cropped_vest, hair_ribbon, looking_at_viewer, midriff, short_shorts, smile, bare_shoulders, belt_buckle, blush, collarbone, hoop_earrings, open_mouth, small_breasts, black_belt, one_eye_closed, ;d, bandeau, blue_sky, hand_on_own_hip, outdoors, simple_background, white_background |
| 8 | 7 |  |  |  |  |  | 1girl, black_dress, collared_shirt, hair_ribbon, long_sleeves, looking_at_viewer, neck_ribbon, pinafore_dress, red_ribbon, solo, white_shirt, open_mouth, pleated_dress, simple_background, white_background, black_ribbon, hair_bow, school_uniform, blush, :d, bag, holding |
| 9 | 7 |  |  |  |  |  | looking_at_viewer, rabbit_ears, reverse_bunnysuit, fake_animal_ears, heart_pasties, revealing_clothes, shrug_(clothing), wrist_cuffs, 1girl, heart_maebari, navel, blush, full_body, large_breasts, latex, medium_breasts, multiple_girls, open_mouth, smile, blonde_hair, covered_nipples, long_sleeves, pantyhose, solo, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_hair | crescent_facial_mark | forehead_mark | fur_trim | looking_at_viewer | pink_hair | red_bodysuit | solo | star-shaped_pupils | yellow_eyes | medium_breasts | black_gloves | open_mouth | ass | smile | fur-trimmed_cloak | red_cloak | gloves | bodysuit | blue_eyes | orb | black_bodysuit | grey_eyes | hair_bow | katana | small_breasts | belt | cleavage_cutout | closed_mouth | sheathed | space | earth_(planet) | holding_sword | sheath | parted_lips | fingerless_gloves | navel | yellow_gloves | yellow_shorts | yellow_vest | midriff | short_shorts | gun | hoop_earrings | hair_ribbon | yellow_footwear | bandaid | holding | knee_boots | full_body | open_vest | cropped_vest | bare_shoulders | belt_buckle | blush | collarbone | black_belt | one_eye_closed | ;d | bandeau | blue_sky | hand_on_own_hip | outdoors | simple_background | white_background | black_dress | collared_shirt | long_sleeves | neck_ribbon | pinafore_dress | red_ribbon | white_shirt | pleated_dress | black_ribbon | school_uniform | :d | bag | rabbit_ears | reverse_bunnysuit | fake_animal_ears | heart_pasties | revealing_clothes | shrug_(clothing) | wrist_cuffs | heart_maebari | large_breasts | latex | multiple_girls | blonde_hair | covered_nipples | pantyhose | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-----------------------|:----------------|:-----------|:--------------------|:------------|:---------------|:-------|:---------------------|:--------------|:-----------------|:---------------|:-------------|:------|:--------|:--------------------|:------------|:---------|:-----------|:------------|:------|:-----------------|:------------|:-----------|:---------|:----------------|:-------|:------------------|:---------------|:-----------|:--------|:-----------------|:----------------|:---------|:--------------|:--------------------|:--------|:----------------|:----------------|:--------------|:----------|:---------------|:------|:----------------|:--------------|:------------------|:----------|:----------|:-------------|:------------|:------------|:---------------|:-----------------|:--------------|:--------|:-------------|:-------------|:-----------------|:-----|:----------|:-----------|:------------------|:-----------|:--------------------|:-------------------|:--------------|:-----------------|:---------------|:--------------|:-----------------|:-------------|:--------------|:----------------|:---------------|:-----------------|:-----|:------|:--------------|:--------------------|:-------------------|:----------------|:--------------------|:-------------------|:--------------|:----------------|:----------------|:--------|:-----------------|:--------------|:------------------|:------------|:---------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | | X | | | X | X | | | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | | X | | | X | | | | | | | X | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | | | X | | | X | | | X | | | | | | | | | | | X | X | | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | | | X | | | X | | | | | | | | | | | | | | X | X | | X | X | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | | | | X | | | X | | | | | | | X | | | | | | | | | | X | | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | | | | | X | | | X | | | | | X | | X | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | | | X | | | X | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | | | X | | | X | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
cyanic-selkie/wikianc | ---
license: cc-by-sa-4.0
pretty_name: WikiAnc
annotations_creators:
- machine-generated
- crowdsourced
language_creators:
- machine-generated
- crowdsourced
task_categories:
- token-classification
multilinguality:
- multilingual
language:
- en
- ceb
- de
- sv
- fr
- nl
- ru
- es
- it
- arz
- pl
- ja
- zh
- vi
- uk
- war
- ar
- pt
- fa
- ca
- sr
- id
- ko
- 'no'
- ce
- fi
- cs
- tr
- hu
- tt
- sh
- ro
#- zh-min-nan
- eu
- ms
- eo
- he
- hy
- da
- bg
- cy
- sk
- azb
- uz
- et
#- simple
- be
- kk
- min
- el
- hr
- lt
- gl
- az
- ur
- sl
- lld
- ka
- nn
- hi
- th
- ta
- bn
- la
- mk
#- zh-yue
- ast
- lv
- af
- tg
- my
- mg
- mr
- sq
- bs
- oc
- te
- ml
- nds
- br
- ky
- sw
- jv
- lmo
- new
- pnb
- vec
- ht
- pms
- ba
- lb
- su
- ku
- ga
- szl
- is
- fy
- cv
- ckb
- pa
- tl
- an
- wuu
- diq
- io
- sco
- vo
- yo
- ne
- ia
- kn
- gu
- als
- ha
- avk
- bar
- crh
- scn
- bpy
- qu
- mn
- nv
- xmf
- ban
- si
- tum
- ps
- ig
- frr
- os
- mzn
#- bat-smg
- or
- sah
- cdo
- gd
- bug
- yi
- sd
- ilo
- am
- nap
- li
- bcl
- fo
- gor
- hsb
#- map-bms
- mai
- shn
- eml
- ace
#- zh-classical
- sa
- as
- wa
- ie
- hyw
- lij
- mhr
- zu
- sn
- hif
- mrj
- bjn
- km
- mni
- hak
#- roa-tara
- pam
- sat
- rue
- nso
- bh
- so
- mi
- se
- myv
- vls
#- nds-nl
- dag
- sc
- co
- ary
- kw
- bo
- vep
- glk
- tk
- kab
- gan
- rw
#- fiu-vro
- ab
- gv
- ug
- nah
- zea
- skr
- frp
- udm
- pcd
- mt
- kv
- csb
- gn
- smn
- ay
- nrm
- ks
- lez
- lfn
- olo
- mwl
- lo
- stq
- ang
- mdf
- fur
- rm
- lad
- kaa
- gom
- ext
- koi
- tyv
- pap
- av
- dsb
- ln
- dty
- tw
#- cbk-zam
- dv
- ksh
- za
- gag
- bxr
- pfl
- lg
- szy
- pag
- blk
- pi
- tay
- haw
- awa
- inh
- krc
- xal
- pdc
- to
- atj
- tcy
- arc
- mnw
- shi
- jam
- kbp
- wo
- anp
- kbd
- nia
- om
- nov
- ki
- nqo
- bi
- xh
- tpi
- ff
- tet
#- roa-rup
- jbo
- fj
- kg
- lbe
- ty
- cu
- guw
- trv
- ami
- srn
- sm
- mad
- alt
- ltg
- gcr
- chr
- tn
- ny
- st
- pih
- got
- rmy
- ee
- pcm
- bm
- ss
- gpe
- ts
- ve
- kcg
- chy
- rn
- ch
- gur
- ik
- ady
- fat
- pnt
- guc
- iu
- pwn
- sg
- din
- ti
- kl
- dz
- cr
tags:
- wikidata
- wikipedia
- wikification
- named-entity-linking
- nel
- entity-linking
- el
- named-entity-disambiguation
- ned
- entity-disambiguation
- ed
configs:
- config_name: ab
data_files:
- split: train
path: "data/ab/train.parquet"
- split: validation
path: "data/ab/validation.parquet"
- config_name: ace
data_files:
- split: train
path: "data/ace/train.parquet"
- split: validation
path: "data/ace/validation.parquet"
- config_name: ady
data_files:
- split: train
path: "data/ady/train.parquet"
- split: validation
path: "data/ady/validation.parquet"
- config_name: af
data_files:
- split: train
path: "data/af/train.parquet"
- split: validation
path: "data/af/validation.parquet"
- config_name: als
data_files:
- split: train
path: "data/als/train.parquet"
- split: validation
path: "data/als/validation.parquet"
- config_name: alt
data_files:
- split: train
path: "data/alt/train.parquet"
- split: validation
path: "data/alt/validation.parquet"
- config_name: am
data_files:
- split: train
path: "data/am/train.parquet"
- split: validation
path: "data/am/validation.parquet"
- config_name: ami
data_files:
- split: train
path: "data/ami/train.parquet"
- split: validation
path: "data/ami/validation.parquet"
- config_name: an
data_files:
- split: train
path: "data/an/train.parquet"
- split: validation
path: "data/an/validation.parquet"
- config_name: ang
data_files:
- split: train
path: "data/ang/train.parquet"
- split: validation
path: "data/ang/validation.parquet"
- config_name: anp
data_files:
- split: train
path: "data/anp/train.parquet"
- split: validation
path: "data/anp/validation.parquet"
- config_name: ar
data_files:
- split: train
path: "data/ar/train.parquet"
- split: validation
path: "data/ar/validation.parquet"
- config_name: arc
data_files:
- split: train
path: "data/arc/train.parquet"
- split: validation
path: "data/arc/validation.parquet"
- config_name: ary
data_files:
- split: train
path: "data/ary/train.parquet"
- split: validation
path: "data/ary/validation.parquet"
- config_name: arz
data_files:
- split: train
path: "data/arz/train.parquet"
- split: validation
path: "data/arz/validation.parquet"
- config_name: as
data_files:
- split: train
path: "data/as/train.parquet"
- split: validation
path: "data/as/validation.parquet"
- config_name: ast
data_files:
- split: train
path: "data/ast/train.parquet"
- split: validation
path: "data/ast/validation.parquet"
- config_name: atj
data_files:
- split: train
path: "data/atj/train.parquet"
- split: validation
path: "data/atj/validation.parquet"
- config_name: av
data_files:
- split: train
path: "data/av/train.parquet"
- split: validation
path: "data/av/validation.parquet"
- config_name: avk
data_files:
- split: train
path: "data/avk/train.parquet"
- split: validation
path: "data/avk/validation.parquet"
- config_name: awa
data_files:
- split: train
path: "data/awa/train.parquet"
- split: validation
path: "data/awa/validation.parquet"
- config_name: ay
data_files:
- split: train
path: "data/ay/train.parquet"
- split: validation
path: "data/ay/validation.parquet"
- config_name: az
data_files:
- split: train
path: "data/az/train.parquet"
- split: validation
path: "data/az/validation.parquet"
- config_name: azb
data_files:
- split: train
path: "data/azb/train.parquet"
- split: validation
path: "data/azb/validation.parquet"
- config_name: ba
data_files:
- split: train
path: "data/ba/train.parquet"
- split: validation
path: "data/ba/validation.parquet"
- config_name: ban
data_files:
- split: train
path: "data/ban/train.parquet"
- split: validation
path: "data/ban/validation.parquet"
- config_name: bar
data_files:
- split: train
path: "data/bar/train.parquet"
- split: validation
path: "data/bar/validation.parquet"
- config_name: bat_smg
data_files:
- split: train
path: "data/bat_smg/train.parquet"
- split: validation
path: "data/bat_smg/validation.parquet"
- config_name: bcl
data_files:
- split: train
path: "data/bcl/train.parquet"
- split: validation
path: "data/bcl/validation.parquet"
- config_name: be
data_files:
- split: train
path: "data/be/train.parquet"
- split: validation
path: "data/be/validation.parquet"
- config_name: bg
data_files:
- split: train
path: "data/bg/train.parquet"
- split: validation
path: "data/bg/validation.parquet"
- config_name: bh
data_files:
- split: train
path: "data/bh/train.parquet"
- split: validation
path: "data/bh/validation.parquet"
- config_name: bi
data_files:
- split: train
path: "data/bi/train.parquet"
- split: validation
path: "data/bi/validation.parquet"
- config_name: bjn
data_files:
- split: train
path: "data/bjn/train.parquet"
- split: validation
path: "data/bjn/validation.parquet"
- config_name: blk
data_files:
- split: train
path: "data/blk/train.parquet"
- split: validation
path: "data/blk/validation.parquet"
- config_name: bm
data_files:
- split: train
path: "data/bm/train.parquet"
- split: validation
path: "data/bm/validation.parquet"
- config_name: bn
data_files:
- split: train
path: "data/bn/train.parquet"
- split: validation
path: "data/bn/validation.parquet"
- config_name: bo
data_files:
- split: train
path: "data/bo/train.parquet"
- split: validation
path: "data/bo/validation.parquet"
- config_name: bpy
data_files:
- split: train
path: "data/bpy/train.parquet"
- split: validation
path: "data/bpy/validation.parquet"
- config_name: br
data_files:
- split: train
path: "data/br/train.parquet"
- split: validation
path: "data/br/validation.parquet"
- config_name: bs
data_files:
- split: train
path: "data/bs/train.parquet"
- split: validation
path: "data/bs/validation.parquet"
- config_name: bug
data_files:
- split: train
path: "data/bug/train.parquet"
- split: validation
path: "data/bug/validation.parquet"
- config_name: bxr
data_files:
- split: train
path: "data/bxr/train.parquet"
- split: validation
path: "data/bxr/validation.parquet"
- config_name: ca
data_files:
- split: train
path: "data/ca/train.parquet"
- split: validation
path: "data/ca/validation.parquet"
- config_name: cbk_zam
data_files:
- split: train
path: "data/cbk_zam/train.parquet"
- split: validation
path: "data/cbk_zam/validation.parquet"
- config_name: cdo
data_files:
- split: train
path: "data/cdo/train.parquet"
- split: validation
path: "data/cdo/validation.parquet"
- config_name: ce
data_files:
- split: train
path: "data/ce/train.parquet"
- split: validation
path: "data/ce/validation.parquet"
- config_name: ceb
data_files:
- split: train
path: "data/ceb/train.parquet"
- split: validation
path: "data/ceb/validation.parquet"
- config_name: ch
data_files:
- split: train
path: "data/ch/train.parquet"
- split: validation
path: "data/ch/validation.parquet"
- config_name: chr
data_files:
- split: train
path: "data/chr/train.parquet"
- split: validation
path: "data/chr/validation.parquet"
- config_name: chy
data_files:
- split: train
path: "data/chy/train.parquet"
- split: validation
path: "data/chy/validation.parquet"
- config_name: ckb
data_files:
- split: train
path: "data/ckb/train.parquet"
- split: validation
path: "data/ckb/validation.parquet"
- config_name: co
data_files:
- split: train
path: "data/co/train.parquet"
- split: validation
path: "data/co/validation.parquet"
- config_name: cr
data_files:
- split: train
path: "data/cr/train.parquet"
- split: validation
path: "data/cr/validation.parquet"
- config_name: crh
data_files:
- split: train
path: "data/crh/train.parquet"
- split: validation
path: "data/crh/validation.parquet"
- config_name: cs
data_files:
- split: train
path: "data/cs/train.parquet"
- split: validation
path: "data/cs/validation.parquet"
- config_name: csb
data_files:
- split: train
path: "data/csb/train.parquet"
- split: validation
path: "data/csb/validation.parquet"
- config_name: cu
data_files:
- split: train
path: "data/cu/train.parquet"
- split: validation
path: "data/cu/validation.parquet"
- config_name: cv
data_files:
- split: train
path: "data/cv/train.parquet"
- split: validation
path: "data/cv/validation.parquet"
- config_name: cy
data_files:
- split: train
path: "data/cy/train.parquet"
- split: validation
path: "data/cy/validation.parquet"
- config_name: da
data_files:
- split: train
path: "data/da/train.parquet"
- split: validation
path: "data/da/validation.parquet"
- config_name: dag
data_files:
- split: train
path: "data/dag/train.parquet"
- split: validation
path: "data/dag/validation.parquet"
- config_name: de
data_files:
- split: train
path: "data/de/train.parquet"
- split: validation
path: "data/de/validation.parquet"
- config_name: din
data_files:
- split: train
path: "data/din/train.parquet"
- split: validation
path: "data/din/validation.parquet"
- config_name: diq
data_files:
- split: train
path: "data/diq/train.parquet"
- split: validation
path: "data/diq/validation.parquet"
- config_name: dsb
data_files:
- split: train
path: "data/dsb/train.parquet"
- split: validation
path: "data/dsb/validation.parquet"
- config_name: dty
data_files:
- split: train
path: "data/dty/train.parquet"
- split: validation
path: "data/dty/validation.parquet"
- config_name: dv
data_files:
- split: train
path: "data/dv/train.parquet"
- split: validation
path: "data/dv/validation.parquet"
- config_name: dz
data_files:
- split: train
path: "data/dz/train.parquet"
- split: validation
path: "data/dz/validation.parquet"
- config_name: ee
data_files:
- split: train
path: "data/ee/train.parquet"
- split: validation
path: "data/ee/validation.parquet"
- config_name: el
data_files:
- split: train
path: "data/el/train.parquet"
- split: validation
path: "data/el/validation.parquet"
- config_name: eml
data_files:
- split: train
path: "data/eml/train.parquet"
- split: validation
path: "data/eml/validation.parquet"
- config_name: en
data_files:
- split: train
path: "data/en/train.parquet"
- split: validation
path: "data/en/validation.parquet"
- config_name: eo
data_files:
- split: train
path: "data/eo/train.parquet"
- split: validation
path: "data/eo/validation.parquet"
- config_name: es
data_files:
- split: train
path: "data/es/train.parquet"
- split: validation
path: "data/es/validation.parquet"
- config_name: et
data_files:
- split: train
path: "data/et/train.parquet"
- split: validation
path: "data/et/validation.parquet"
- config_name: eu
data_files:
- split: train
path: "data/eu/train.parquet"
- split: validation
path: "data/eu/validation.parquet"
- config_name: ext
data_files:
- split: train
path: "data/ext/train.parquet"
- split: validation
path: "data/ext/validation.parquet"
- config_name: fa
data_files:
- split: train
path: "data/fa/train.parquet"
- split: validation
path: "data/fa/validation.parquet"
- config_name: fat
data_files:
- split: train
path: "data/fat/train.parquet"
- split: validation
path: "data/fat/validation.parquet"
- config_name: ff
data_files:
- split: train
path: "data/ff/train.parquet"
- split: validation
path: "data/ff/validation.parquet"
- config_name: fi
data_files:
- split: train
path: "data/fi/train.parquet"
- split: validation
path: "data/fi/validation.parquet"
- config_name: fiu_vro
data_files:
- split: train
path: "data/fiu_vro/train.parquet"
- split: validation
path: "data/fiu_vro/validation.parquet"
- config_name: fj
data_files:
- split: train
path: "data/fj/train.parquet"
- split: validation
path: "data/fj/validation.parquet"
- config_name: fo
data_files:
- split: train
path: "data/fo/train.parquet"
- split: validation
path: "data/fo/validation.parquet"
- config_name: fr
data_files:
- split: train
path: "data/fr/train.parquet"
- split: validation
path: "data/fr/validation.parquet"
- config_name: frp
data_files:
- split: train
path: "data/frp/train.parquet"
- split: validation
path: "data/frp/validation.parquet"
- config_name: frr
data_files:
- split: train
path: "data/frr/train.parquet"
- split: validation
path: "data/frr/validation.parquet"
- config_name: fur
data_files:
- split: train
path: "data/fur/train.parquet"
- split: validation
path: "data/fur/validation.parquet"
- config_name: fy
data_files:
- split: train
path: "data/fy/train.parquet"
- split: validation
path: "data/fy/validation.parquet"
- config_name: ga
data_files:
- split: train
path: "data/ga/train.parquet"
- split: validation
path: "data/ga/validation.parquet"
- config_name: gag
data_files:
- split: train
path: "data/gag/train.parquet"
- split: validation
path: "data/gag/validation.parquet"
- config_name: gan
data_files:
- split: train
path: "data/gan/train.parquet"
- split: validation
path: "data/gan/validation.parquet"
- config_name: gcr
data_files:
- split: train
path: "data/gcr/train.parquet"
- split: validation
path: "data/gcr/validation.parquet"
- config_name: gd
data_files:
- split: train
path: "data/gd/train.parquet"
- split: validation
path: "data/gd/validation.parquet"
- config_name: gl
data_files:
- split: train
path: "data/gl/train.parquet"
- split: validation
path: "data/gl/validation.parquet"
- config_name: glk
data_files:
- split: train
path: "data/glk/train.parquet"
- split: validation
path: "data/glk/validation.parquet"
- config_name: gn
data_files:
- split: train
path: "data/gn/train.parquet"
- split: validation
path: "data/gn/validation.parquet"
- config_name: gom
data_files:
- split: train
path: "data/gom/train.parquet"
- split: validation
path: "data/gom/validation.parquet"
- config_name: gor
data_files:
- split: train
path: "data/gor/train.parquet"
- split: validation
path: "data/gor/validation.parquet"
- config_name: got
data_files:
- split: train
path: "data/got/train.parquet"
- split: validation
path: "data/got/validation.parquet"
- config_name: gpe
data_files:
- split: train
path: "data/gpe/train.parquet"
- split: validation
path: "data/gpe/validation.parquet"
- config_name: gu
data_files:
- split: train
path: "data/gu/train.parquet"
- split: validation
path: "data/gu/validation.parquet"
- config_name: guc
data_files:
- split: train
path: "data/guc/train.parquet"
- split: validation
path: "data/guc/validation.parquet"
- config_name: gur
data_files:
- split: train
path: "data/gur/train.parquet"
- split: validation
path: "data/gur/validation.parquet"
- config_name: guw
data_files:
- split: train
path: "data/guw/train.parquet"
- split: validation
path: "data/guw/validation.parquet"
- config_name: gv
data_files:
- split: train
path: "data/gv/train.parquet"
- split: validation
path: "data/gv/validation.parquet"
- config_name: ha
data_files:
- split: train
path: "data/ha/train.parquet"
- split: validation
path: "data/ha/validation.parquet"
- config_name: hak
data_files:
- split: train
path: "data/hak/train.parquet"
- split: validation
path: "data/hak/validation.parquet"
- config_name: haw
data_files:
- split: train
path: "data/haw/train.parquet"
- split: validation
path: "data/haw/validation.parquet"
- config_name: he
data_files:
- split: train
path: "data/he/train.parquet"
- split: validation
path: "data/he/validation.parquet"
- config_name: hi
data_files:
- split: train
path: "data/hi/train.parquet"
- split: validation
path: "data/hi/validation.parquet"
- config_name: hif
data_files:
- split: train
path: "data/hif/train.parquet"
- split: validation
path: "data/hif/validation.parquet"
- config_name: hr
data_files:
- split: train
path: "data/hr/train.parquet"
- split: validation
path: "data/hr/validation.parquet"
- config_name: hsb
data_files:
- split: train
path: "data/hsb/train.parquet"
- split: validation
path: "data/hsb/validation.parquet"
- config_name: ht
data_files:
- split: train
path: "data/ht/train.parquet"
- split: validation
path: "data/ht/validation.parquet"
- config_name: hu
data_files:
- split: train
path: "data/hu/train.parquet"
- split: validation
path: "data/hu/validation.parquet"
- config_name: hy
data_files:
- split: train
path: "data/hy/train.parquet"
- split: validation
path: "data/hy/validation.parquet"
- config_name: hyw
data_files:
- split: train
path: "data/hyw/train.parquet"
- split: validation
path: "data/hyw/validation.parquet"
- config_name: ia
data_files:
- split: train
path: "data/ia/train.parquet"
- split: validation
path: "data/ia/validation.parquet"
- config_name: id
data_files:
- split: train
path: "data/id/train.parquet"
- split: validation
path: "data/id/validation.parquet"
- config_name: ie
data_files:
- split: train
path: "data/ie/train.parquet"
- split: validation
path: "data/ie/validation.parquet"
- config_name: ig
data_files:
- split: train
path: "data/ig/train.parquet"
- split: validation
path: "data/ig/validation.parquet"
- config_name: ik
data_files:
- split: train
path: "data/ik/train.parquet"
- split: validation
path: "data/ik/validation.parquet"
- config_name: ilo
data_files:
- split: train
path: "data/ilo/train.parquet"
- split: validation
path: "data/ilo/validation.parquet"
- config_name: inh
data_files:
- split: train
path: "data/inh/train.parquet"
- split: validation
path: "data/inh/validation.parquet"
- config_name: io
data_files:
- split: train
path: "data/io/train.parquet"
- split: validation
path: "data/io/validation.parquet"
- config_name: is
data_files:
- split: train
path: "data/is/train.parquet"
- split: validation
path: "data/is/validation.parquet"
- config_name: it
data_files:
- split: train
path: "data/it/train.parquet"
- split: validation
path: "data/it/validation.parquet"
- config_name: iu
data_files:
- split: train
path: "data/iu/train.parquet"
- split: validation
path: "data/iu/validation.parquet"
- config_name: ja
data_files:
- split: train
path: "data/ja/train.parquet"
- split: validation
path: "data/ja/validation.parquet"
- config_name: jam
data_files:
- split: train
path: "data/jam/train.parquet"
- split: validation
path: "data/jam/validation.parquet"
- config_name: jbo
data_files:
- split: train
path: "data/jbo/train.parquet"
- split: validation
path: "data/jbo/validation.parquet"
- config_name: jv
data_files:
- split: train
path: "data/jv/train.parquet"
- split: validation
path: "data/jv/validation.parquet"
- config_name: ka
data_files:
- split: train
path: "data/ka/train.parquet"
- split: validation
path: "data/ka/validation.parquet"
- config_name: kaa
data_files:
- split: train
path: "data/kaa/train.parquet"
- split: validation
path: "data/kaa/validation.parquet"
- config_name: kab
data_files:
- split: train
path: "data/kab/train.parquet"
- split: validation
path: "data/kab/validation.parquet"
- config_name: kbd
data_files:
- split: train
path: "data/kbd/train.parquet"
- split: validation
path: "data/kbd/validation.parquet"
- config_name: kbp
data_files:
- split: train
path: "data/kbp/train.parquet"
- split: validation
path: "data/kbp/validation.parquet"
- config_name: kcg
data_files:
- split: train
path: "data/kcg/train.parquet"
- split: validation
path: "data/kcg/validation.parquet"
- config_name: kg
data_files:
- split: train
path: "data/kg/train.parquet"
- split: validation
path: "data/kg/validation.parquet"
- config_name: ki
data_files:
- split: train
path: "data/ki/train.parquet"
- split: validation
path: "data/ki/validation.parquet"
- config_name: kk
data_files:
- split: train
path: "data/kk/train.parquet"
- split: validation
path: "data/kk/validation.parquet"
- config_name: kl
data_files:
- split: train
path: "data/kl/train.parquet"
- split: validation
path: "data/kl/validation.parquet"
- config_name: km
data_files:
- split: train
path: "data/km/train.parquet"
- split: validation
path: "data/km/validation.parquet"
- config_name: kn
data_files:
- split: train
path: "data/kn/train.parquet"
- split: validation
path: "data/kn/validation.parquet"
- config_name: ko
data_files:
- split: train
path: "data/ko/train.parquet"
- split: validation
path: "data/ko/validation.parquet"
- config_name: koi
data_files:
- split: train
path: "data/koi/train.parquet"
- split: validation
path: "data/koi/validation.parquet"
- config_name: krc
data_files:
- split: train
path: "data/krc/train.parquet"
- split: validation
path: "data/krc/validation.parquet"
- config_name: ks
data_files:
- split: train
path: "data/ks/train.parquet"
- split: validation
path: "data/ks/validation.parquet"
- config_name: ksh
data_files:
- split: train
path: "data/ksh/train.parquet"
- split: validation
path: "data/ksh/validation.parquet"
- config_name: ku
data_files:
- split: train
path: "data/ku/train.parquet"
- split: validation
path: "data/ku/validation.parquet"
- config_name: kv
data_files:
- split: train
path: "data/kv/train.parquet"
- split: validation
path: "data/kv/validation.parquet"
- config_name: kw
data_files:
- split: train
path: "data/kw/train.parquet"
- split: validation
path: "data/kw/validation.parquet"
- config_name: ky
data_files:
- split: train
path: "data/ky/train.parquet"
- split: validation
path: "data/ky/validation.parquet"
- config_name: la
data_files:
- split: train
path: "data/la/train.parquet"
- split: validation
path: "data/la/validation.parquet"
- config_name: lad
data_files:
- split: train
path: "data/lad/train.parquet"
- split: validation
path: "data/lad/validation.parquet"
- config_name: lb
data_files:
- split: train
path: "data/lb/train.parquet"
- split: validation
path: "data/lb/validation.parquet"
- config_name: lbe
data_files:
- split: train
path: "data/lbe/train.parquet"
- split: validation
path: "data/lbe/validation.parquet"
- config_name: lez
data_files:
- split: train
path: "data/lez/train.parquet"
- split: validation
path: "data/lez/validation.parquet"
- config_name: lfn
data_files:
- split: train
path: "data/lfn/train.parquet"
- split: validation
path: "data/lfn/validation.parquet"
- config_name: lg
data_files:
- split: train
path: "data/lg/train.parquet"
- split: validation
path: "data/lg/validation.parquet"
- config_name: li
data_files:
- split: train
path: "data/li/train.parquet"
- split: validation
path: "data/li/validation.parquet"
- config_name: lij
data_files:
- split: train
path: "data/lij/train.parquet"
- split: validation
path: "data/lij/validation.parquet"
- config_name: lld
data_files:
- split: train
path: "data/lld/train.parquet"
- split: validation
path: "data/lld/validation.parquet"
- config_name: lmo
data_files:
- split: train
path: "data/lmo/train.parquet"
- split: validation
path: "data/lmo/validation.parquet"
- config_name: ln
data_files:
- split: train
path: "data/ln/train.parquet"
- split: validation
path: "data/ln/validation.parquet"
- config_name: lo
data_files:
- split: train
path: "data/lo/train.parquet"
- split: validation
path: "data/lo/validation.parquet"
- config_name: lt
data_files:
- split: train
path: "data/lt/train.parquet"
- split: validation
path: "data/lt/validation.parquet"
- config_name: ltg
data_files:
- split: train
path: "data/ltg/train.parquet"
- split: validation
path: "data/ltg/validation.parquet"
- config_name: lv
data_files:
- split: train
path: "data/lv/train.parquet"
- split: validation
path: "data/lv/validation.parquet"
- config_name: mad
data_files:
- split: train
path: "data/mad/train.parquet"
- split: validation
path: "data/mad/validation.parquet"
- config_name: mai
data_files:
- split: train
path: "data/mai/train.parquet"
- split: validation
path: "data/mai/validation.parquet"
- config_name: map_bms
data_files:
- split: train
path: "data/map_bms/train.parquet"
- split: validation
path: "data/map_bms/validation.parquet"
- config_name: mdf
data_files:
- split: train
path: "data/mdf/train.parquet"
- split: validation
path: "data/mdf/validation.parquet"
- config_name: mg
data_files:
- split: train
path: "data/mg/train.parquet"
- split: validation
path: "data/mg/validation.parquet"
- config_name: mhr
data_files:
- split: train
path: "data/mhr/train.parquet"
- split: validation
path: "data/mhr/validation.parquet"
- config_name: mi
data_files:
- split: train
path: "data/mi/train.parquet"
- split: validation
path: "data/mi/validation.parquet"
- config_name: min
data_files:
- split: train
path: "data/min/train.parquet"
- split: validation
path: "data/min/validation.parquet"
- config_name: mk
data_files:
- split: train
path: "data/mk/train.parquet"
- split: validation
path: "data/mk/validation.parquet"
- config_name: ml
data_files:
- split: train
path: "data/ml/train.parquet"
- split: validation
path: "data/ml/validation.parquet"
- config_name: mn
data_files:
- split: train
path: "data/mn/train.parquet"
- split: validation
path: "data/mn/validation.parquet"
- config_name: mni
data_files:
- split: train
path: "data/mni/train.parquet"
- split: validation
path: "data/mni/validation.parquet"
- config_name: mnw
data_files:
- split: train
path: "data/mnw/train.parquet"
- split: validation
path: "data/mnw/validation.parquet"
- config_name: mr
data_files:
- split: train
path: "data/mr/train.parquet"
- split: validation
path: "data/mr/validation.parquet"
- config_name: mrj
data_files:
- split: train
path: "data/mrj/train.parquet"
- split: validation
path: "data/mrj/validation.parquet"
- config_name: ms
data_files:
- split: train
path: "data/ms/train.parquet"
- split: validation
path: "data/ms/validation.parquet"
- config_name: mt
data_files:
- split: train
path: "data/mt/train.parquet"
- split: validation
path: "data/mt/validation.parquet"
- config_name: mwl
data_files:
- split: train
path: "data/mwl/train.parquet"
- split: validation
path: "data/mwl/validation.parquet"
- config_name: my
data_files:
- split: train
path: "data/my/train.parquet"
- split: validation
path: "data/my/validation.parquet"
- config_name: myv
data_files:
- split: train
path: "data/myv/train.parquet"
- split: validation
path: "data/myv/validation.parquet"
- config_name: mzn
data_files:
- split: train
path: "data/mzn/train.parquet"
- split: validation
path: "data/mzn/validation.parquet"
- config_name: nah
data_files:
- split: train
path: "data/nah/train.parquet"
- split: validation
path: "data/nah/validation.parquet"
- config_name: nap
data_files:
- split: train
path: "data/nap/train.parquet"
- split: validation
path: "data/nap/validation.parquet"
- config_name: nds
data_files:
- split: train
path: "data/nds/train.parquet"
- split: validation
path: "data/nds/validation.parquet"
- config_name: nds_nl
data_files:
- split: train
path: "data/nds_nl/train.parquet"
- split: validation
path: "data/nds_nl/validation.parquet"
- config_name: ne
data_files:
- split: train
path: "data/ne/train.parquet"
- split: validation
path: "data/ne/validation.parquet"
- config_name: new
data_files:
- split: train
path: "data/new/train.parquet"
- split: validation
path: "data/new/validation.parquet"
- config_name: nia
data_files:
- split: train
path: "data/nia/train.parquet"
- split: validation
path: "data/nia/validation.parquet"
- config_name: nl
data_files:
- split: train
path: "data/nl/train.parquet"
- split: validation
path: "data/nl/validation.parquet"
- config_name: nn
data_files:
- split: train
path: "data/nn/train.parquet"
- split: validation
path: "data/nn/validation.parquet"
- config_name: 'no'
data_files:
- split: train
path: "data/no/train.parquet"
- split: validation
path: "data/no/validation.parquet"
- config_name: nov
data_files:
- split: train
path: "data/nov/train.parquet"
- split: validation
path: "data/nov/validation.parquet"
- config_name: nqo
data_files:
- split: train
path: "data/nqo/train.parquet"
- split: validation
path: "data/nqo/validation.parquet"
- config_name: nrm
data_files:
- split: train
path: "data/nrm/train.parquet"
- split: validation
path: "data/nrm/validation.parquet"
- config_name: nso
data_files:
- split: train
path: "data/nso/train.parquet"
- split: validation
path: "data/nso/validation.parquet"
- config_name: nv
data_files:
- split: train
path: "data/nv/train.parquet"
- split: validation
path: "data/nv/validation.parquet"
- config_name: ny
data_files:
- split: train
path: "data/ny/train.parquet"
- split: validation
path: "data/ny/validation.parquet"
- config_name: oc
data_files:
- split: train
path: "data/oc/train.parquet"
- split: validation
path: "data/oc/validation.parquet"
- config_name: olo
data_files:
- split: train
path: "data/olo/train.parquet"
- split: validation
path: "data/olo/validation.parquet"
- config_name: om
data_files:
- split: train
path: "data/om/train.parquet"
- split: validation
path: "data/om/validation.parquet"
- config_name: or
data_files:
- split: train
path: "data/or/train.parquet"
- split: validation
path: "data/or/validation.parquet"
- config_name: os
data_files:
- split: train
path: "data/os/train.parquet"
- split: validation
path: "data/os/validation.parquet"
- config_name: pa
data_files:
- split: train
path: "data/pa/train.parquet"
- split: validation
path: "data/pa/validation.parquet"
- config_name: pag
data_files:
- split: train
path: "data/pag/train.parquet"
- split: validation
path: "data/pag/validation.parquet"
- config_name: pam
data_files:
- split: train
path: "data/pam/train.parquet"
- split: validation
path: "data/pam/validation.parquet"
- config_name: pap
data_files:
- split: train
path: "data/pap/train.parquet"
- split: validation
path: "data/pap/validation.parquet"
- config_name: pcd
data_files:
- split: train
path: "data/pcd/train.parquet"
- split: validation
path: "data/pcd/validation.parquet"
- config_name: pcm
data_files:
- split: train
path: "data/pcm/train.parquet"
- split: validation
path: "data/pcm/validation.parquet"
- config_name: pdc
data_files:
- split: train
path: "data/pdc/train.parquet"
- split: validation
path: "data/pdc/validation.parquet"
- config_name: pfl
data_files:
- split: train
path: "data/pfl/train.parquet"
- split: validation
path: "data/pfl/validation.parquet"
- config_name: pi
data_files:
- split: train
path: "data/pi/train.parquet"
- split: validation
path: "data/pi/validation.parquet"
- config_name: pih
data_files:
- split: train
path: "data/pih/train.parquet"
- split: validation
path: "data/pih/validation.parquet"
- config_name: pl
data_files:
- split: train
path: "data/pl/train.parquet"
- split: validation
path: "data/pl/validation.parquet"
- config_name: pms
data_files:
- split: train
path: "data/pms/train.parquet"
- split: validation
path: "data/pms/validation.parquet"
- config_name: pnb
data_files:
- split: train
path: "data/pnb/train.parquet"
- split: validation
path: "data/pnb/validation.parquet"
- config_name: pnt
data_files:
- split: train
path: "data/pnt/train.parquet"
- split: validation
path: "data/pnt/validation.parquet"
- config_name: ps
data_files:
- split: train
path: "data/ps/train.parquet"
- split: validation
path: "data/ps/validation.parquet"
- config_name: pt
data_files:
- split: train
path: "data/pt/train.parquet"
- split: validation
path: "data/pt/validation.parquet"
- config_name: pwn
data_files:
- split: train
path: "data/pwn/train.parquet"
- split: validation
path: "data/pwn/validation.parquet"
- config_name: qu
data_files:
- split: train
path: "data/qu/train.parquet"
- split: validation
path: "data/qu/validation.parquet"
- config_name: rm
data_files:
- split: train
path: "data/rm/train.parquet"
- split: validation
path: "data/rm/validation.parquet"
- config_name: rmy
data_files:
- split: train
path: "data/rmy/train.parquet"
- split: validation
path: "data/rmy/validation.parquet"
- config_name: rn
data_files:
- split: train
path: "data/rn/train.parquet"
- split: validation
path: "data/rn/validation.parquet"
- config_name: ro
data_files:
- split: train
path: "data/ro/train.parquet"
- split: validation
path: "data/ro/validation.parquet"
- config_name: roa_rup
data_files:
- split: train
path: "data/roa_rup/train.parquet"
- split: validation
path: "data/roa_rup/validation.parquet"
- config_name: roa_tara
data_files:
- split: train
path: "data/roa_tara/train.parquet"
- split: validation
path: "data/roa_tara/validation.parquet"
- config_name: ru
data_files:
- split: train
path: "data/ru/train.parquet"
- split: validation
path: "data/ru/validation.parquet"
- config_name: rue
data_files:
- split: train
path: "data/rue/train.parquet"
- split: validation
path: "data/rue/validation.parquet"
- config_name: rw
data_files:
- split: train
path: "data/rw/train.parquet"
- split: validation
path: "data/rw/validation.parquet"
- config_name: sa
data_files:
- split: train
path: "data/sa/train.parquet"
- split: validation
path: "data/sa/validation.parquet"
- config_name: sah
data_files:
- split: train
path: "data/sah/train.parquet"
- split: validation
path: "data/sah/validation.parquet"
- config_name: sat
data_files:
- split: train
path: "data/sat/train.parquet"
- split: validation
path: "data/sat/validation.parquet"
- config_name: sc
data_files:
- split: train
path: "data/sc/train.parquet"
- split: validation
path: "data/sc/validation.parquet"
- config_name: scn
data_files:
- split: train
path: "data/scn/train.parquet"
- split: validation
path: "data/scn/validation.parquet"
- config_name: sco
data_files:
- split: train
path: "data/sco/train.parquet"
- split: validation
path: "data/sco/validation.parquet"
- config_name: sd
data_files:
- split: train
path: "data/sd/train.parquet"
- split: validation
path: "data/sd/validation.parquet"
- config_name: se
data_files:
- split: train
path: "data/se/train.parquet"
- split: validation
path: "data/se/validation.parquet"
- config_name: sg
data_files:
- split: train
path: "data/sg/train.parquet"
- split: validation
path: "data/sg/validation.parquet"
- config_name: sh
data_files:
- split: train
path: "data/sh/train.parquet"
- split: validation
path: "data/sh/validation.parquet"
- config_name: shi
data_files:
- split: train
path: "data/shi/train.parquet"
- split: validation
path: "data/shi/validation.parquet"
- config_name: shn
data_files:
- split: train
path: "data/shn/train.parquet"
- split: validation
path: "data/shn/validation.parquet"
- config_name: si
data_files:
- split: train
path: "data/si/train.parquet"
- split: validation
path: "data/si/validation.parquet"
- config_name: simple
data_files:
- split: train
path: "data/simple/train.parquet"
- split: validation
path: "data/simple/validation.parquet"
- config_name: sk
data_files:
- split: train
path: "data/sk/train.parquet"
- split: validation
path: "data/sk/validation.parquet"
- config_name: skr
data_files:
- split: train
path: "data/skr/train.parquet"
- split: validation
path: "data/skr/validation.parquet"
- config_name: sl
data_files:
- split: train
path: "data/sl/train.parquet"
- split: validation
path: "data/sl/validation.parquet"
- config_name: sm
data_files:
- split: train
path: "data/sm/train.parquet"
- split: validation
path: "data/sm/validation.parquet"
- config_name: smn
data_files:
- split: train
path: "data/smn/train.parquet"
- split: validation
path: "data/smn/validation.parquet"
- config_name: sn
data_files:
- split: train
path: "data/sn/train.parquet"
- split: validation
path: "data/sn/validation.parquet"
- config_name: so
data_files:
- split: train
path: "data/so/train.parquet"
- split: validation
path: "data/so/validation.parquet"
- config_name: sq
data_files:
- split: train
path: "data/sq/train.parquet"
- split: validation
path: "data/sq/validation.parquet"
- config_name: sr
data_files:
- split: train
path: "data/sr/train.parquet"
- split: validation
path: "data/sr/validation.parquet"
- config_name: srn
data_files:
- split: train
path: "data/srn/train.parquet"
- split: validation
path: "data/srn/validation.parquet"
- config_name: ss
data_files:
- split: train
path: "data/ss/train.parquet"
- split: validation
path: "data/ss/validation.parquet"
- config_name: st
data_files:
- split: train
path: "data/st/train.parquet"
- split: validation
path: "data/st/validation.parquet"
- config_name: stq
data_files:
- split: train
path: "data/stq/train.parquet"
- split: validation
path: "data/stq/validation.parquet"
- config_name: su
data_files:
- split: train
path: "data/su/train.parquet"
- split: validation
path: "data/su/validation.parquet"
- config_name: sv
data_files:
- split: train
path: "data/sv/train.parquet"
- split: validation
path: "data/sv/validation.parquet"
- config_name: sw
data_files:
- split: train
path: "data/sw/train.parquet"
- split: validation
path: "data/sw/validation.parquet"
- config_name: szl
data_files:
- split: train
path: "data/szl/train.parquet"
- split: validation
path: "data/szl/validation.parquet"
- config_name: szy
data_files:
- split: train
path: "data/szy/train.parquet"
- split: validation
path: "data/szy/validation.parquet"
- config_name: ta
data_files:
- split: train
path: "data/ta/train.parquet"
- split: validation
path: "data/ta/validation.parquet"
- config_name: tay
data_files:
- split: train
path: "data/tay/train.parquet"
- split: validation
path: "data/tay/validation.parquet"
- config_name: tcy
data_files:
- split: train
path: "data/tcy/train.parquet"
- split: validation
path: "data/tcy/validation.parquet"
- config_name: te
data_files:
- split: train
path: "data/te/train.parquet"
- split: validation
path: "data/te/validation.parquet"
- config_name: tet
data_files:
- split: train
path: "data/tet/train.parquet"
- split: validation
path: "data/tet/validation.parquet"
- config_name: tg
data_files:
- split: train
path: "data/tg/train.parquet"
- split: validation
path: "data/tg/validation.parquet"
- config_name: th
data_files:
- split: train
path: "data/th/train.parquet"
- split: validation
path: "data/th/validation.parquet"
- config_name: ti
data_files:
- split: train
path: "data/ti/train.parquet"
- split: validation
path: "data/ti/validation.parquet"
- config_name: tk
data_files:
- split: train
path: "data/tk/train.parquet"
- split: validation
path: "data/tk/validation.parquet"
- config_name: tl
data_files:
- split: train
path: "data/tl/train.parquet"
- split: validation
path: "data/tl/validation.parquet"
- config_name: tn
data_files:
- split: train
path: "data/tn/train.parquet"
- split: validation
path: "data/tn/validation.parquet"
- config_name: to
data_files:
- split: train
path: "data/to/train.parquet"
- split: validation
path: "data/to/validation.parquet"
- config_name: tpi
data_files:
- split: train
path: "data/tpi/train.parquet"
- split: validation
path: "data/tpi/validation.parquet"
- config_name: tr
data_files:
- split: train
path: "data/tr/train.parquet"
- split: validation
path: "data/tr/validation.parquet"
- config_name: trv
data_files:
- split: train
path: "data/trv/train.parquet"
- split: validation
path: "data/trv/validation.parquet"
- config_name: ts
data_files:
- split: train
path: "data/ts/train.parquet"
- split: validation
path: "data/ts/validation.parquet"
- config_name: tt
data_files:
- split: train
path: "data/tt/train.parquet"
- split: validation
path: "data/tt/validation.parquet"
- config_name: tum
data_files:
- split: train
path: "data/tum/train.parquet"
- split: validation
path: "data/tum/validation.parquet"
- config_name: tw
data_files:
- split: train
path: "data/tw/train.parquet"
- split: validation
path: "data/tw/validation.parquet"
- config_name: ty
data_files:
- split: train
path: "data/ty/train.parquet"
- split: validation
path: "data/ty/validation.parquet"
- config_name: tyv
data_files:
- split: train
path: "data/tyv/train.parquet"
- split: validation
path: "data/tyv/validation.parquet"
- config_name: udm
data_files:
- split: train
path: "data/udm/train.parquet"
- split: validation
path: "data/udm/validation.parquet"
- config_name: ug
data_files:
- split: train
path: "data/ug/train.parquet"
- split: validation
path: "data/ug/validation.parquet"
- config_name: uk
data_files:
- split: train
path: "data/uk/train.parquet"
- split: validation
path: "data/uk/validation.parquet"
- config_name: ur
data_files:
- split: train
path: "data/ur/train.parquet"
- split: validation
path: "data/ur/validation.parquet"
- config_name: uz
data_files:
- split: train
path: "data/uz/train.parquet"
- split: validation
path: "data/uz/validation.parquet"
- config_name: ve
data_files:
- split: train
path: "data/ve/train.parquet"
- split: validation
path: "data/ve/validation.parquet"
- config_name: vec
data_files:
- split: train
path: "data/vec/train.parquet"
- split: validation
path: "data/vec/validation.parquet"
- config_name: vep
data_files:
- split: train
path: "data/vep/train.parquet"
- split: validation
path: "data/vep/validation.parquet"
- config_name: vi
data_files:
- split: train
path: "data/vi/train.parquet"
- split: validation
path: "data/vi/validation.parquet"
- config_name: vls
data_files:
- split: train
path: "data/vls/train.parquet"
- split: validation
path: "data/vls/validation.parquet"
- config_name: vo
data_files:
- split: train
path: "data/vo/train.parquet"
- split: validation
path: "data/vo/validation.parquet"
- config_name: wa
data_files:
- split: train
path: "data/wa/train.parquet"
- split: validation
path: "data/wa/validation.parquet"
- config_name: war
data_files:
- split: train
path: "data/war/train.parquet"
- split: validation
path: "data/war/validation.parquet"
- config_name: wo
data_files:
- split: train
path: "data/wo/train.parquet"
- split: validation
path: "data/wo/validation.parquet"
- config_name: wuu
data_files:
- split: train
path: "data/wuu/train.parquet"
- split: validation
path: "data/wuu/validation.parquet"
- config_name: xal
data_files:
- split: train
path: "data/xal/train.parquet"
- split: validation
path: "data/xal/validation.parquet"
- config_name: xh
data_files:
- split: train
path: "data/xh/train.parquet"
- split: validation
path: "data/xh/validation.parquet"
- config_name: xmf
data_files:
- split: train
path: "data/xmf/train.parquet"
- split: validation
path: "data/xmf/validation.parquet"
- config_name: yi
data_files:
- split: train
path: "data/yi/train.parquet"
- split: validation
path: "data/yi/validation.parquet"
- config_name: yo
data_files:
- split: train
path: "data/yo/train.parquet"
- split: validation
path: "data/yo/validation.parquet"
- config_name: za
data_files:
- split: train
path: "data/za/train.parquet"
- split: validation
path: "data/za/validation.parquet"
- config_name: zea
data_files:
- split: train
path: "data/zea/train.parquet"
- split: validation
path: "data/zea/validation.parquet"
- config_name: zh
data_files:
- split: train
path: "data/zh/train.parquet"
- split: validation
path: "data/zh/validation.parquet"
- config_name: zh_classical
data_files:
- split: train
path: "data/zh_classical/train.parquet"
- split: validation
path: "data/zh_classical/validation.parquet"
- config_name: zh_min_nan
data_files:
- split: train
path: "data/zh_min_nan/train.parquet"
- split: validation
path: "data/zh_min_nan/validation.parquet"
- config_name: zh_yue
data_files:
- split: train
path: "data/zh_yue/train.parquet"
- split: validation
path: "data/zh_yue/validation.parquet"
- config_name: zu
data_files:
- split: train
path: "data/zu/train.parquet"
- split: validation
path: "data/zu/validation.parquet"
---
# Dataset Card for WikiAnc
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Repository:** [WikiAnc repository](https://github.com/cyanic-selkie/wikianc)
### Dataset Summary
The WikiAnc dataset is automatically generated from Wikipedia (all languages) and Wikidata dumps (August 2023).
The code for generating the dataset can be found [here](https://github.com/cyanic-selkie/wikianc).
### Supported Tasks
- `wikification`: The dataset can be used to train a model for Wikification.
- `named-entity-linking`: The dataset can be used to train a model for Named Entity Linking.
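Each language is exposed as a separate config whose name is the Wikipedia language code with hyphens replaced by underscores (e.g. `zh-min-nan` becomes `zh_min_nan`). A minimal loading sketch with 🤗 Datasets follows; note that the Hub repository id used below is an assumption based on the linked GitHub repository, not confirmed by this card:

```python
def wiki_code_to_config(code):
    """Map a Wikipedia language code to this dataset's config name.

    Config names replace hyphens with underscores,
    e.g. "zh-min-nan" -> "zh_min_nan", "bat-smg" -> "bat_smg".
    """
    return code.replace("-", "_")


def load_wikianc(code, split="train"):
    # Lazy import so the helper above works without `datasets` installed.
    from datasets import load_dataset

    # NOTE: "cyanic-selkie/wikianc" is an assumed repository id; substitute
    # the actual Hub id of this dataset if it differs.
    return load_dataset("cyanic-selkie/wikianc", wiki_code_to_config(code), split=split)
```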
### Languages
The text in the dataset is in all 320 Wikipedia languages. The full list can be found in the table below.
## Dataset Structure
### Data Instances
A typical data point represents a paragraph in a Wikipedia article.
The `paragraph_text` field contains the original text in an NFC normalized, UTF-8 encoded string.
The `paragraph_anchors` field contains a list of anchors, each represented by a struct with an inclusive starting UTF-8 code point offset (`start`), an exclusive ending UTF-8 code point offset (`end`), a nullable `qid` field, a nullable `pageid` field, and an NFC normalized, UTF-8 encoded Wikipedia `title` field.
Additionally, each paragraph has `article_title`, `article_pageid`, and (nullable) `article_qid` fields referring to the article the paragraph came from.
There are also a nullable, NFC normalized, UTF-8 encoded `section_heading` field and an integer `section_level` field, giving the heading (if any) of the article section the paragraph came from and that section's level in the section hierarchy.
The `qid` field refers to Wikidata's QID identifiers, while the `pageid` and `title` fields refer to Wikipedia's pageID and title identifiers (there is a one-to-one mapping between pageIDs and titles).
**NOTE:** An anchor will always have a `title`, but that doesn't mean it has to have a `pageid`. This is because Wikipedia allows defining anchors to nonexistent articles.
An example from the WikiAnc EN test set looks as follows:
```json
{
"uuid": "5f74e678-944f-4761-a5e0-b6426f6f61b8",
"article_title": "Climatius",
"article_pageid": 5394373,
"article_qid": 867987,
"section_heading": null,
"section_level": 0,
"paragraph_text": "It was a small fish, at 7.5 cm, and to discourage predators, Climatius sported fifteen sharp spines. There was one spine each on the paired pelvic and pectoral fins, and on the aingle anal and two dorsal fins, and a four pairs without fins on the fish's underside.",
"paragraph_anchors": [
{
"start": 140,
"end": 146,
"qid": 3335089,
"pageid": 56849833,
"title": "Pelvic_fin"
},
{
"start": 151,
"end": 159,
"qid": 4162555,
"pageid": 331956,
"title": "Pectoral_fin"
},
{
"start": 184,
"end": 188,
"qid": 4162555,
"pageid": 331958,
"title": "Anal_fin"
},
{
"start": 197,
"end": 208,
"qid": 1568355,
"pageid": 294244,
"title": "Dorsal_fin"
}
]
}
```
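The `start`/`end` offsets can be used to recover the surface form of each anchor directly from `paragraph_text`. A short sketch, assuming the offsets count Unicode code points (which is exactly how Python indexes `str` objects); the example data is abridged from the test-set instance above:

```python
def anchor_surface_forms(example):
    """Return (mention text, qid) pairs for every anchor in a paragraph."""
    text = example["paragraph_text"]
    return [
        (text[a["start"]:a["end"]], a["qid"])
        for a in example["paragraph_anchors"]
    ]


example = {
    "paragraph_text": (
        "It was a small fish, at 7.5 cm, and to discourage predators, "
        "Climatius sported fifteen sharp spines. There was one spine each "
        "on the paired pelvic and pectoral fins, ..."
    ),
    "paragraph_anchors": [
        {"start": 140, "end": 146, "qid": 3335089, "pageid": 56849833, "title": "Pelvic_fin"},
        {"start": 151, "end": 159, "qid": 4162555, "pageid": 331956, "title": "Pectoral_fin"},
    ],
}

print(anchor_surface_forms(example))  # [('pelvic', 3335089), ('pectoral', 4162555)]
```

For named entity linking, anchors with a null `qid` (or null `pageid`) can simply be filtered out before building mention/entity pairs.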
### Data Fields
- `uuid`: a UTF-8 encoded string representing a v4 UUID that uniquely identifies the example
- `article_title`: an NFC normalized, UTF-8 encoded Wikipedia title of the article; spaces are replaced with underscores
- `article_pageid`: an integer representing the Wikipedia pageID of the article
- `article_qid`: an integer representing the Wikidata QID this article refers to; it can be null if the entity didn't exist in Wikidata at the time of the creation of the original dataset
- `section_heading`: a nullable, NFC normalized, UTF-8 encoded string representing the section heading
- `section_level`: an integer representing the level of the section in the section hierarchy
- `paragraph_text`: an NFC normalized, UTF-8 encoded string representing the paragraph
- `paragraph_anchors`: a list of structs representing anchors, each anchor has:
  - `start`: an integer representing the inclusive starting UTF-8 code point of the anchor
- `end`: an integer representing the exclusive ending UTF-8 code point of the anchor
- `qid`: a nullable integer representing the Wikidata QID this anchor refers to; it can be null if the entity didn't exist in Wikidata at the time of the creation of the original dataset
- `pageid`: a nullable integer representing the Wikipedia pageID of the anchor; it can be null if the article didn't exist in Wikipedia at the time of the creation of the original dataset
- `title`: an NFC normalized, UTF-8 encoded string representing the Wikipedia title of the anchor; spaces are replaced with underscores; can refer to a nonexistent Wikipedia article
### Data Splits
The data is split into training, validation and test sets; paragraphs belonging to the same article aren't necessarily in the same split. The final split sizes are as follows:
#### Train
| | Articles | Paragraphs | Anchors | Anchors with QIDs | Anchors with PageIDs |
| :-- | --: | --: | --: | --: | --: |
| ab | 2378 | 5678 | 10515 | 3649 | 3650 |
| ace | 12591 | 23969 | 48638 | 25150 | 25175 |
| ady | 596 | 1662 | 2694 | 1593 | 1606 |
| af | 104470 | 399038 | 985640 | 900596 | 900967 |
| als | 27999 | 165085 | 402049 | 294742 | 294744 |
| alt | 1043 | 7468 | 9158 | 5446 | 5452 |
| am | 13576 | 46318 | 90051 | 51915 | 52173 |
| ami | 1582 | 12428 | 6080 | 1505 | 2579 |
| an | 40179 | 121367 | 669830 | 516248 | 516822 |
| ang | 3833 | 9664 | 24297 | 10189 | 10229 |
| anp | 2506 | 6865 | 14560 | 3825 | 5061 |
| ar | 1132271 | 3617491 | 11657228 | 11240112 | 11244160 |
| arc | 1844 | 3766 | 9232 | 5460 | 5545 |
| ary | 6736 | 17049 | 50185 | 34193 | 34227 |
| arz | 1579782 | 3693549 | 7879303 | 6906799 | 6917393 |
| as | 11947 | 77835 | 122760 | 67594 | 67720 |
| ast | 126992 | 877278 | 2952000 | 1775764 | 1777383 |
| atj | 1872 | 3820 | 6544 | 3247 | 3365 |
| av | 3048 | 8542 | 16115 | 8895 | 9000 |
| avk | 27577 | 85219 | 106100 | 32260 | 33491 |
| awa | 3396 | 5802 | 6617 | 1679 | 2370 |
| ay | 5102 | 15125 | 22802 | 13930 | 13933 |
| az | 180810 | 789902 | 1570889 | 1377797 | 1380325 |
| azb | 240990 | 585386 | 1241661 | 749575 | 753318 |
| ba | 62269 | 391926 | 625645 | 562730 | 563181 |
| ban | 18955 | 44138 | 86239 | 66213 | 66412 |
| bar | 26057 | 83298 | 185158 | 109082 | 109091 |
| bat_smg | 17013 | 41951 | 77417 | 51701 | 51733 |
| bcl | 13783 | 45457 | 78963 | 47819 | 47861 |
| be | 222883 | 821135 | 2499258 | 2204062 | 2204117 |
| bg | 285156 | 1336530 | 3967713 | 3618800 | 3627798 |
| bh | 7658 | 17052 | 29110 | 22157 | 22217 |
| bi | 1403 | 1712 | 3172 | 1991 | 1995 |
| bjn | 9672 | 19007 | 58660 | 32538 | 33071 |
| blk | 2786 | 11825 | 11341 | 5979 | 6129 |
| bm | 1111 | 2421 | 2451 | 1217 | 1218 |
| bn | 136921 | 736388 | 1530942 | 1161967 | 1162761 |
| bo | 11843 | 37121 | 8241 | 6265 | 6359 |
| bpy | 24742 | 115606 | 166906 | 86166 | 86170 |
| br | 78524 | 214128 | 657375 | 527295 | 527606 |
| bs | 86407 | 382114 | 1246030 | 965782 | 966511 |
| bug | 14231 | 14484 | 53879 | 14787 | 15146 |
| bxr | 2730 | 9571 | 27853 | 11560 | 11567 |
| ca | 691444 | 3596667 | 11359870 | 10236358 | 10237666 |
| cbk_zam | 2989 | 8322 | 9939 | 2790 | 2847 |
| cdo | 15922 | 30059 | 63474 | 29659 | 29705 |
| ce | 597137 | 2121587 | 3097393 | 1507129 | 1507806 |
| ceb | 5888811 | 11920613 | 37969424 | 33678489 | 33962205 |
| ch | 574 | 1166 | 2290 | 492 | 601 |
| chr | 980 | 1110 | 1311 | 779 | 790 |
| chy | 711 | 753 | 494 | 428 | 428 |
| ckb | 48903 | 163599 | 435662 | 224749 | 226749 |
| co | 6719 | 22954 | 46391 | 24149 | 24229 |
| cr | 158 | 216 | 209 | 94 | 94 |
| crh | 24117 | 29781 | 98534 | 70231 | 70235 |
| cs | 516037 | 2679537 | 9917806 | 8763103 | 8763291 |
| csb | 5315 | 14009 | 31294 | 16820 | 16820 |
| cu | 1171 | 2796 | 5283 | 2346 | 2349 |
| cv | 50525 | 157542 | 375399 | 166889 | 167497 |
| cy | 276031 | 992900 | 2011030 | 1613064 | 1620632 |
| da | 284765 | 1167917 | 4352733 | 3854239 | 3854549 |
| dag | 9248 | 29213 | 46084 | 10981 | 14213 |
| de | 2780056 | 16093948 | 52497421 | 50480495 | 50480548 |
| din | 485 | 1551 | 1096 | 197 | 197 |
| diq | 37565 | 70969 | 155656 | 141636 | 141695 |
| dsb | 3083 | 8760 | 19397 | 9652 | 9652 |
| dty | 3339 | 6219 | 7505 | 4417 | 4447 |
| dv | 4190 | 16809 | 7906 | 3612 | 3620 |
| dz | 652 | 2623 | 272 | 94 | 100 |
| ee | 1075 | 2326 | 1823 | 861 | 926 |
| el | 224207 | 1527561 | 4181433 | 3119952 | 3121967 |
| eml | 12169 | 53861 | 115729 | 65775 | 65940 |
| en | 6514924 | 40656507 | 109681826 | 107761324 | 107768438 |
| eo | 330486 | 1116191 | 4257655 | 3975927 | 3979379 |
| es | 1792062 | 10890435 | 33729712 | 31581851 | 31648945 |
| et | 233078 | 1110906 | 3558448 | 2879595 | 2886824 |
| eu | 386029 | 1405747 | 3398477 | 3025183 | 3030635 |
| ext | 3472 | 9626 | 20554 | 11966 | 11978 |
| fa | 901254 | 2357271 | 6189352 | 5862106 | 5870803 |
| fat | 1044 | 6092 | 1717 | 120 | 857 |
| ff | 1763 | 4103 | 3483 | 2304 | 2413 |
| fi | 373226 | 1667296 | 5221239 | 4658292 | 4663471 |
| fiu_vro | 6417 | 19897 | 40418 | 23563 | 23609 |
| fj | 1157 | 1782 | 4852 | 1910 | 1911 |
| fo | 11809 | 30828 | 119267 | 95117 | 95259 |
| fr | 2432972 | 15252697 | 43564517 | 42573624 | 42589064 |
| frp | 5341 | 10574 | 36358 | 24905 | 24926 |
| frr | 16038 | 30821 | 80265 | 68184 | 68315 |
| fur | 3665 | 10651 | 29516 | 16249 | 16278 |
| fy | 46011 | 206153 | 1271339 | 985227 | 985511 |
| ga | 52168 | 130535 | 347037 | 288261 | 288309 |
| gag | 2408 | 4844 | 8551 | 4520 | 4520 |
| gan | 4219 | 9689 | 18994 | 14119 | 14128 |
| gcr | 2227 | 5163 | 2763 | 1186 | 1186 |
| gd | 15850 | 48217 | 141290 | 95557 | 95562 |
| gl | 190419 | 910543 | 3674404 | 2937660 | 2938634 |
| glk | 6484 | 15344 | 32631 | 21395 | 21447 |
| gn | 5064 | 15481 | 40641 | 30389 | 30440 |
| gom | 4192 | 37508 | 14192 | 2369 | 2382 |
| gor | 14388 | 28133 | 107341 | 66191 | 67016 |
| got | 960 | 2186 | 4093 | 1404 | 1415 |
| gpe | 899 | 3383 | 1199 | 796 | 815 |
| gu | 30025 | 114805 | 459063 | 348651 | 348731 |
| guc | 546 | 2545 | 2300 | 1025 | 1138 |
| gur | 1010 | 5043 | 1761 | 227 | 244 |
| guw | 1263 | 3719 | 7474 | 3116 | 5375 |
| gv | 5036 | 12213 | 48801 | 19659 | 19663 |
| ha | 31977 | 149096 | 115029 | 97167 | 98184 |
| hak | 8694 | 11505 | 39744 | 28150 | 28152 |
| haw | 2470 | 5810 | 11169 | 5700 | 5705 |
| he | 323472 | 2648617 | 10904148 | 10367532 | 10379886 |
| hi | 150121 | 538451 | 964251 | 795726 | 798254 |
| hif | 10534 | 21169 | 43463 | 23970 | 24316 |
| hr | 189415 | 876107 | 3210326 | 2752205 | 2758602 |
| hsb | 13183 | 40760 | 91863 | 66632 | 66633 |
| ht | 64850 | 154160 | 201547 | 166206 | 167961 |
| hu | 346711 | 1859683 | 5267990 | 4707580 | 4710525 |
| hy | 298066 | 1542920 | 3767938 | 2689014 | 2690466 |
| hyw | 11358 | 83640 | 161227 | 82218 | 84817 |
| ia | 24581 | 43289 | 129914 | 96517 | 96595 |
| id | 620895 | 2138237 | 6589957 | 5629372 | 5644832 |
| ie | 11020 | 22342 | 60890 | 46054 | 46122 |
| ig | 19448 | 110907 | 57963 | 31022 | 31298 |
| ik | 737 | 1016 | 848 | 551 | 580 |
| ilo | 14135 | 74304 | 126533 | 75701 | 75705 |
| inh | 1754 | 4640 | 13284 | 5770 | 6011 |
| io | 36312 | 101555 | 303765 | 258933 | 259001 |
| is | 54348 | 170321 | 574897 | 436767 | 437784 |
| it | 1610989 | 8718610 | 27447754 | 26116131 | 26126157 |
| iu | 502 | 757 | 536 | 414 | 418 |
| ja | 1355269 | 9276459 | 29002111 | 27752954 | 27801000 |
| jam | 1571 | 2260 | 5887 | 3588 | 3590 |
| jbo | 1287 | 3088 | 5831 | 546 | 546 |
| jv | 66323 | 148710 | 547010 | 381682 | 382052 |
| ka | 167161 | 695865 | 2275552 | 422090 | 422095 |
| kaa | 3540 | 9814 | 12930 | 5312 | 5752 |
| kab | 5346 | 14709 | 36889 | 22000 | 22050 |
| kbd | 1549 | 6348 | 14594 | 5277 | 5280 |
| kbp | 1846 | 6005 | 7119 | 6875 | 6880 |
| kcg | 871 | 1839 | 2953 | 1857 | 1871 |
| kg | 1187 | 1933 | 3835 | 2292 | 2295 |
| ki | 1482 | 2899 | 2035 | 1386 | 1649 |
| kk | 235740 | 889990 | 1840304 | 1143049 | 1151399 |
| kl | 282 | 1024 | 1337 | 302 | 302 |
| km | 11422 | 84697 | 111378 | 40954 | 41529 |
| kn | 30729 | 261724 | 432994 | 188536 | 188807 |
| ko | 606386 | 2159706 | 6217786 | 5715559 | 5725614 |
| koi | 3260 | 9065 | 17068 | 10628 | 10628 |
| krc | 1465 | 6234 | 18092 | 7294 | 7311 |
| ks | 4176 | 9446 | 15252 | 5917 | 6226 |
| ksh | 2836 | 11043 | 26577 | 9484 | 9496 |
| ku | 55166 | 112840 | 269080 | 208679 | 210304 |
| kv | 5236 | 13396 | 32141 | 26727 | 26744 |
| kw | 6884 | 18901 | 49462 | 28074 | 28194 |
| ky | 75426 | 191772 | 271376 | 189656 | 190133 |
| la | 124150 | 240343 | 1456464 | 1283285 | 1283728 |
| lad | 3538 | 11910 | 37456 | 19124 | 19124 |
| lb | 57747 | 178507 | 573528 | 443583 | 444601 |
| lbe | 1205 | 2249 | 4470 | 2543 | 2543 |
| lez | 4067 | 16675 | 36970 | 25834 | 25842 |
| lfn | 4506 | 21746 | 29785 | 14554 | 14560 |
| lg | 3814 | 23386 | 15539 | 2088 | 2724 |
| li | 14134 | 58711 | 212772 | 137110 | 137367 |
| lij | 8092 | 23366 | 61410 | 34939 | 34940 |
| lld | 152613 | 158049 | 578033 | 443976 | 458150 |
| lmo | 67387 | 136650 | 373890 | 274174 | 274612 |
| ln | 3132 | 6066 | 11086 | 7838 | 7874 |
| lo | 4734 | 15005 | 27132 | 8562 | 8799 |
| lt | 204135 | 775863 | 2687983 | 2406710 | 2414909 |
| ltg | 1018 | 2979 | 5815 | 2190 | 2193 |
| lv | 118530 | 437086 | 1458341 | 1244609 | 1247181 |
| mad | 1113 | 3500 | 3762 | 1149 | 1157 |
| mai | 13285 | 22572 | 53246 | 38119 | 38128 |
| map_bms | 10875 | 16411 | 67964 | 51125 | 51137 |
| mdf | 4002 | 11043 | 21658 | 9178 | 9183 |
| mg | 92227 | 213580 | 328751 | 265931 | 267633 |
| mhr | 11010 | 33013 | 60771 | 38153 | 38220 |
| mi | 7274 | 10154 | 29052 | 24854 | 25216 |
| min | 223075 | 422381 | 1315030 | 513108 | 515548 |
| mk | 131522 | 695456 | 1984109 | 1639280 | 1640744 |
| ml | 84334 | 415940 | 797903 | 485482 | 486324 |
| mn | 23434 | 124485 | 295548 | 142014 | 142984 |
| mni | 10354 | 18872 | 29474 | 18810 | 19876 |
| mnw | 3136 | 34165 | 9342 | 1908 | 2387 |
| mr | 92464 | 326662 | 633452 | 383501 | 392709 |
| mrj | 10156 | 20132 | 48416 | 24098 | 24098 |
| ms | 344459 | 988647 | 2424535 | 1932685 | 1937647 |
| mt | 5381 | 49856 | 104636 | 51251 | 51278 |
| mwl | 4402 | 37271 | 127176 | 25729 | 26366 |
| my | 103938 | 334243 | 445026 | 300567 | 303288 |
| myv | 7515 | 21592 | 36762 | 26570 | 26591 |
| mzn | 17364 | 39937 | 89805 | 46962 | 47020 |
| nah | 5934 | 12478 | 30805 | 13093 | 14364 |
| nap | 11235 | 22336 | 41891 | 20798 | 20804 |
| nds | 79228 | 242004 | 583941 | 305374 | 305422 |
| nds_nl | 6484 | 28252 | 94875 | 51767 | 51785 |
| ne | 30359 | 91033 | 153937 | 124841 | 125078 |
| new | 71653 | 245033 | 454251 | 289444 | 289912 |
| nia | 1496 | 4047 | 4524 | 2258 | 2812 |
| nl | 1948842 | 5867108 | 17953497 | 16886996 | 16893078 |
| nn | 160106 | 549454 | 1751481 | 1375622 | 1376155 |
| no | 591000 | 2213493 | 7050421 | 6471776 | 6476157 |
| nov | 1341 | 3711 | 7466 | 3948 | 3955 |
| nqo | 1489 | 9858 | 23633 | 6056 | 6981 |
| nrm | 4571 | 14279 | 38935 | 33295 | 33321 |
| nso | 7618 | 9505 | 36826 | 35621 | 35623 |
| nv | 21911 | 57663 | 123762 | 107139 | 107139 |
| ny | 1060 | 3164 | 4750 | 1455 | 1490 |
| oc | 85099 | 303185 | 1035051 | 791403 | 792043 |
| olo | 4348 | 14334 | 18704 | 8634 | 8647 |
| om | 1710 | 7496 | 8222 | 4333 | 4416 |
| or | 17027 | 76677 | 137274 | 57023 | 57064 |
| os | 17468 | 40488 | 80943 | 48124 | 48414 |
| pa | 50421 | 226354 | 344239 | 197594 | 198080 |
| pag | 2533 | 41416 | 4150 | 2907 | 2907 |
| pam | 7816 | 16493 | 53785 | 29375 | 29715 |
| pap | 3153 | 12086 | 22157 | 18161 | 18233 |
| pcd | 5272 | 12203 | 15602 | 12319 | 12360 |
| pcm | 1019 | 4631 | 4161 | 1160 | 1261 |
| pdc | 2009 | 5406 | 8151 | 4122 | 4144 |
| pfl | 2717 | 14024 | 26150 | 10291 | 10294 |
| pi | 2972 | 5959 | 7773 | 201 | 201 |
| pih | 829 | 1065 | 2857 | 2016 | 2018 |
| pl | 1468194 | 5599437 | 19364191 | 18389560 | 18405120 |
| pms | 66552 | 170133 | 369956 | 308593 | 314917 |
| pnb | 67534 | 402101 | 937247 | 525105 | 533265 |
| pnt | 497 | 1467 | 3553 | 1715 | 1716 |
| ps | 19254 | 134868 | 72493 | 36348 | 36899 |
| pt | 1048823 | 5226543 | 16811382 | 15714686 | 15714890 |
| pwn | 328 | 1825 | 990 | 428 | 430 |
| qu | 22365 | 47078 | 133032 | 106686 | 106708 |
| rm | 3569 | 27345 | 47169 | 20460 | 20490 |
| rmy | 911 | 2221 | 4235 | 1854 | 1965 |
| rn | 726 | 1641 | 1436 | 594 | 601 |
| ro | 417630 | 1518438 | 4282072 | 3764830 | 3765626 |
| roa_rup | 1270 | 2751 | 4641 | 2527 | 2537 |
| roa_tara | 8407 | 18031 | 42040 | 14330 | 14331 |
| ru | 1889271 | 12344758 | 30796034 | 29268121 | 29288089 |
| rue | 7369 | 21429 | 61022 | 43241 | 43256 |
| rw | 7793 | 35619 | 38066 | 19821 | 20967 |
| sa | 12069 | 78188 | 104193 | 40307 | 41518 |
| sah | 16007 | 76450 | 82154 | 61041 | 61412 |
| sat | 8655 | 43624 | 57493 | 28497 | 28820 |
| sc | 6919 | 24434 | 66719 | 44707 | 44733 |
| scn | 21990 | 49686 | 132583 | 102735 | 102774 |
| sco | 34097 | 86464 | 301450 | 148184 | 148406 |
| sd | 16228 | 48679 | 79392 | 34572 | 35729 |
| se | 6101 | 10531 | 25844 | 17978 | 18010 |
| sg | 473 | 537 | 318 | 184 | 184 |
| sh | 445218 | 1213741 | 4337559 | 3858400 | 3860253 |
| shi | 1650 | 6036 | 10364 | 4715 | 4926 |
| shn | 10653 | 51542 | 46976 | 29925 | 29993 |
| si | 21959 | 132932 | 146935 | 55158 | 56422 |
| simple | 224811 | 618711 | 2014692 | 1689101 | 1689185 |
| sk | 230073 | 845501 | 2867955 | 2468707 | 2469129 |
| skr | 5505 | 62742 | 38412 | 15004 | 21015 |
| sl | 175804 | 810714 | 2597824 | 2067682 | 2068522 |
| sm | 995 | 1591 | 3838 | 2515 | 2523 |
| smn | 5004 | 12483 | 37008 | 22440 | 22492 |
| sn | 10159 | 19527 | 40437 | 31573 | 32763 |
| so | 8540 | 36173 | 53012 | 42913 | 43548 |
| sq | 94941 | 371562 | 699210 | 520709 | 522241 |
| sr | 657766 | 2331205 | 6562651 | 5257496 | 5264077 |
| srn | 1171 | 3050 | 6637 | 1752 | 1941 |
| ss | 783 | 2124 | 2382 | 1127 | 1139 |
| st | 982 | 1971 | 2510 | 1689 | 1701 |
| stq | 3648 | 10972 | 29713 | 15919 | 15920 |
| su | 57552 | 122590 | 496201 | 384518 | 384891 |
| sv | 2418380 | 5019466 | 22263222 | 21445193 | 21445441 |
| sw | 75109 | 218219 | 798980 | 688743 | 692052 |
| szl | 56229 | 109496 | 473528 | 129434 | 129479 |
| szy | 4628 | 49166 | 18867 | 2419 | 3187 |
| ta | 157642 | 780711 | 1642095 | 1141032 | 1142372 |
| tay | 2643 | 15831 | 10104 | 1496 | 5312 |
| tcy | 2135 | 9932 | 11073 | 4680 | 4745 |
| te | 83866 | 719826 | 822054 | 619184 | 622092 |
| tet | 1323 | 3797 | 8047 | 4093 | 4095 |
| tg | 108598 | 279635 | 761826 | 330974 | 331423 |
| th | 153075 | 715083 | 1723394 | 1395935 | 1398891 |
| ti | 388 | 987 | 1191 | 325 | 326 |
| tk | 4739 | 23629 | 18964 | 9717 | 9760 |
| tl | 43388 | 150141 | 447293 | 296084 | 296634 |
| tn | 1090 | 3960 | 3976 | 2008 | 2010 |
| to | 1512 | 2754 | 3542 | 2029 | 2080 |
| tpi | 1278 | 2055 | 3897 | 2193 | 2198 |
| tr | 500435 | 1806253 | 4476004 | 3964449 | 3965589 |
| trv | 1770 | 16650 | 3814 | 504 | 969 |
| ts | 674 | 1798 | 1557 | 903 | 909 |
| tt | 484761 | 1196573 | 2064576 | 1675637 | 1676579 |
| tum | 16778 | 31383 | 57382 | 28399 | 37107 |
| tw | 3568 | 16807 | 15312 | 10912 | 11495 |
| ty | 1175 | 1364 | 1563 | 1095 | 1095 |
| tyv | 3399 | 21968 | 21004 | 5535 | 5557 |
| udm | 5066 | 11432 | 24875 | 17709 | 17715 |
| ug | 8102 | 58982 | 23654 | 12671 | 12874 |
| uk | 522709 | 2867475 | 6800045 | 6445628 | 6451294 |
| ur | 194948 | 676227 | 1870488 | 910419 | 914840 |
| uz | 232879 | 859793 | 1344790 | 1073065 | 1084092 |
| ve | 764 | 1359 | 2524 | 2366 | 2366 |
| vec | 62729 | 98987 | 275972 | 194424 | 194447 |
| vep | 6853 | 43014 | 93864 | 39225 | 39228 |
| vi | 1300753 | 4103594 | 10852870 | 6884928 | 6892519 |
| vls | 7272 | 26374 | 61885 | 49639 | 49653 |
| vo | 32133 | 78015 | 125495 | 101612 | 101629 |
| wa | 11104 | 56305 | 116752 | 79686 | 80037 |
| war | 1158901 | 1342594 | 6654010 | 6009636 | 6009641 |
| wo | 1659 | 7693 | 10828 | 4057 | 4103 |
| wuu | 37170 | 58227 | 121928 | 82184 | 82237 |
| xal | 2008 | 4309 | 4582 | 2112 | 2113 |
| xh | 1502 | 4448 | 6733 | 2128 | 2186 |
| xmf | 19201 | 49944 | 179291 | 21189 | 22041 |
| yi | 14164 | 68937 | 172645 | 116102 | 116325 |
| yo | 29938 | 52231 | 85171 | 46928 | 47346 |
| za | 2388 | 3917 | 7463 | 4613 | 4665 |
| zea | 5445 | 16648 | 36161 | 23532 | 23578 |
| zh | 1310818 | 5501834 | 16397675 | 14380752 | 14421795 |
| zh_classical | 11775 | 44053 | 140340 | 71576 | 71692 |
| zh_min_nan | 425676 | 853753 | 2627115 | 2053956 | 2054838 |
| zh_yue | 121401 | 273459 | 844047 | 683130 | 683226 |
| zu | 10387 | 18211 | 22569 | 20193 | 20238 |
#### Validation
| | Articles | Paragraphs | Anchors | Anchors with QIDs | Anchors with PageIDs |
| :-- | --: | --: | --: | --: | --: |
| ab | 475 | 601 | 1061 | 399 | 399 |
| ace | 2443 | 2668 | 5197 | 2583 | 2587 |
| ady | 142 | 183 | 248 | 150 | 151 |
| af | 27383 | 44157 | 109108 | 100078 | 100123 |
| als | 11998 | 18277 | 44634 | 32874 | 32874 |
| alt | 481 | 827 | 1020 | 621 | 621 |
| am | 3746 | 5234 | 10111 | 5731 | 5756 |
| ami | 749 | 1431 | 744 | 179 | 304 |
| an | 10526 | 13588 | 74808 | 58195 | 58259 |
| ang | 826 | 1099 | 2647 | 1099 | 1102 |
| anp | 504 | 751 | 1698 | 437 | 581 |
| ar | 265368 | 401215 | 1295968 | 1249666 | 1250103 |
| arc | 377 | 418 | 1061 | 610 | 617 |
| ary | 1447 | 1870 | 5702 | 3885 | 3887 |
| arz | 367206 | 410487 | 876531 | 767742 | 768942 |
| as | 5463 | 8589 | 13953 | 7719 | 7732 |
| ast | 48345 | 97904 | 329690 | 197832 | 198042 |
| atj | 399 | 440 | 774 | 406 | 416 |
| av | 719 | 961 | 1918 | 1043 | 1053 |
| avk | 8056 | 9538 | 11816 | 3633 | 3772 |
| awa | 515 | 645 | 721 | 213 | 287 |
| ay | 1391 | 1653 | 2616 | 1481 | 1483 |
| az | 57070 | 88136 | 177151 | 155596 | 155858 |
| azb | 57642 | 64997 | 137053 | 83336 | 83778 |
| ba | 25690 | 43460 | 69052 | 61624 | 61666 |
| ban | 4053 | 4840 | 9581 | 7374 | 7385 |
| bar | 6905 | 9377 | 20546 | 12164 | 12164 |
| bat_smg | 4149 | 4706 | 8787 | 5820 | 5823 |
| bcl | 3355 | 5058 | 8759 | 5080 | 5083 |
| be | 64203 | 91174 | 276525 | 244114 | 244122 |
| bg | 98148 | 148234 | 438687 | 400356 | 401330 |
| bh | 1535 | 1891 | 3464 | 2630 | 2635 |
| bi | 154 | 159 | 251 | 151 | 151 |
| bjn | 1764 | 2166 | 6458 | 3694 | 3775 |
| blk | 887 | 1374 | 1538 | 821 | 839 |
| bm | 196 | 272 | 317 | 146 | 146 |
| bn | 50495 | 81841 | 169097 | 128508 | 128609 |
| bo | 2198 | 4079 | 934 | 746 | 752 |
| bpy | 10057 | 12879 | 18710 | 9693 | 9693 |
| br | 18687 | 23734 | 73278 | 59024 | 59056 |
| bs | 28533 | 42574 | 138483 | 107760 | 107846 |
| bug | 1636 | 1655 | 6141 | 1682 | 1731 |
| bxr | 754 | 1003 | 2930 | 1211 | 1211 |
| ca | 251952 | 399403 | 1265187 | 1140208 | 1140359 |
| cbk_zam | 460 | 932 | 1040 | 268 | 272 |
| cdo | 2953 | 3237 | 6938 | 3273 | 3281 |
| ce | 197899 | 234617 | 341843 | 166126 | 166206 |
| ceb | 1221405 | 1324624 | 4218179 | 3742385 | 3773844 |
| ch | 123 | 131 | 239 | 64 | 73 |
| chr | 124 | 134 | 175 | 100 | 100 |
| chy | 67 | 67 | 47 | 42 | 42 |
| ckb | 13511 | 18279 | 48490 | 25365 | 25540 |
| co | 1723 | 2587 | 5286 | 2729 | 2737 |
| cr | 22 | 23 | 22 | 13 | 13 |
| crh | 2978 | 3246 | 11005 | 7899 | 7899 |
| cs | 189136 | 297000 | 1101343 | 974485 | 974505 |
| csb | 1307 | 1533 | 3341 | 1851 | 1851 |
| cu | 250 | 275 | 540 | 229 | 229 |
| cv | 14374 | 17462 | 42486 | 19049 | 19114 |
| cy | 89897 | 110225 | 222476 | 177842 | 178698 |
| da | 87765 | 129990 | 482701 | 427333 | 427374 |
| dag | 2215 | 3237 | 4935 | 1169 | 1498 |
| de | 1120553 | 1788057 | 5831103 | 5607963 | 5607963 |
| din | 149 | 177 | 128 | 15 | 15 |
| diq | 6660 | 7883 | 17684 | 15853 | 15861 |
| dsb | 781 | 1032 | 2476 | 1301 | 1301 |
| dty | 554 | 659 | 861 | 480 | 483 |
| dv | 1227 | 1898 | 870 | 406 | 406 |
| dz | 215 | 303 | 21 | 8 | 8 |
| ee | 203 | 242 | 183 | 66 | 74 |
| el | 99725 | 169395 | 461747 | 344216 | 344456 |
| eml | 4387 | 6114 | 13938 | 8193 | 8214 |
| en | 2503257 | 4516442 | 12185882 | 11974436 | 11975194 |
| eo | 90949 | 123848 | 474727 | 442357 | 442772 |
| es | 701171 | 1209944 | 3752765 | 3514968 | 3522213 |
| et | 80911 | 123354 | 395877 | 319773 | 320587 |
| eu | 104388 | 156552 | 378553 | 337331 | 337944 |
| ext | 804 | 1045 | 2269 | 1344 | 1345 |
| fa | 191532 | 262121 | 688824 | 652200 | 653219 |
| fat | 446 | 709 | 214 | 3 | 97 |
| ff | 361 | 459 | 378 | 222 | 234 |
| fi | 123327 | 184244 | 576163 | 514419 | 514915 |
| fiu_vro | 1738 | 2263 | 4622 | 2623 | 2628 |
| fj | 168 | 213 | 604 | 214 | 214 |
| fo | 2625 | 3398 | 13383 | 10599 | 10617 |
| fr | 954388 | 1695419 | 4847588 | 4738268 | 4740047 |
| frp | 1018 | 1181 | 4089 | 2862 | 2862 |
| frr | 2968 | 3419 | 9609 | 7996 | 8011 |
| fur | 884 | 1168 | 3225 | 1833 | 1839 |
| fy | 15980 | 22974 | 139530 | 108300 | 108337 |
| ga | 10781 | 14493 | 38848 | 32343 | 32352 |
| gag | 440 | 551 | 961 | 465 | 465 |
| gan | 731 | 1045 | 2071 | 1536 | 1537 |
| gcr | 480 | 567 | 297 | 122 | 122 |
| gd | 4393 | 5296 | 15544 | 10458 | 10458 |
| gl | 62030 | 101112 | 407821 | 325854 | 325960 |
| glk | 1383 | 1747 | 3723 | 2435 | 2443 |
| gn | 1164 | 1728 | 4751 | 3521 | 3528 |
| gom | 2106 | 4116 | 1511 | 251 | 251 |
| gor | 2844 | 3082 | 11826 | 7315 | 7411 |
| got | 216 | 245 | 514 | 190 | 190 |
| gpe | 265 | 355 | 93 | 71 | 73 |
| gu | 8437 | 13008 | 50956 | 38242 | 38251 |
| guc | 198 | 279 | 312 | 141 | 162 |
| gur | 369 | 565 | 145 | 25 | 27 |
| guw | 332 | 393 | 827 | 313 | 616 |
| gv | 957 | 1324 | 5652 | 2252 | 2253 |
| ha | 10666 | 16571 | 12853 | 10862 | 10993 |
| hak | 1179 | 1302 | 4628 | 3155 | 3155 |
| haw | 541 | 650 | 1238 | 616 | 618 |
| he | 165541 | 295188 | 1213939 | 1153986 | 1155384 |
| hi | 36229 | 60184 | 108382 | 89102 | 89340 |
| hif | 2107 | 2369 | 5015 | 2648 | 2680 |
| hr | 62673 | 97103 | 354392 | 304964 | 305664 |
| hsb | 3599 | 4379 | 10001 | 7239 | 7240 |
| ht | 14693 | 17294 | 23011 | 18721 | 18928 |
| hu | 125438 | 206546 | 586091 | 523501 | 523814 |
| hy | 113060 | 171415 | 418503 | 298111 | 298292 |
| hyw | 5310 | 9207 | 17616 | 8842 | 9168 |
| ia | 4021 | 4850 | 14972 | 11257 | 11263 |
| id | 158648 | 237793 | 734148 | 627764 | 629525 |
| ie | 2213 | 2523 | 6750 | 5036 | 5046 |
| ig | 7944 | 12354 | 6464 | 3466 | 3493 |
| ik | 100 | 118 | 120 | 64 | 71 |
| ilo | 4096 | 8297 | 14183 | 8609 | 8609 |
| inh | 399 | 494 | 1298 | 626 | 645 |
| io | 8868 | 11368 | 33682 | 28744 | 28748 |
| is | 13573 | 18566 | 62576 | 47263 | 47360 |
| it | 584902 | 968880 | 3050620 | 2902006 | 2903047 |
| iu | 61 | 62 | 48 | 29 | 29 |
| ja | 573457 | 1032568 | 3222875 | 3083301 | 3088604 |
| jam | 249 | 274 | 623 | 399 | 399 |
| jbo | 270 | 321 | 562 | 56 | 56 |
| jv | 13108 | 16457 | 60143 | 42112 | 42148 |
| ka | 53071 | 76961 | 252383 | 46974 | 46975 |
| kaa | 775 | 1071 | 1476 | 669 | 717 |
| kab | 1269 | 1685 | 4050 | 2397 | 2403 |
| kbd | 474 | 663 | 1482 | 537 | 537 |
| kbp | 535 | 656 | 835 | 810 | 811 |
| kcg | 190 | 223 | 311 | 196 | 197 |
| kg | 187 | 213 | 420 | 260 | 260 |
| ki | 273 | 333 | 248 | 169 | 206 |
| kk | 76635 | 99268 | 204324 | 126732 | 127677 |
| kl | 97 | 129 | 162 | 43 | 43 |
| km | 3844 | 9340 | 12192 | 4524 | 4583 |
| kn | 14217 | 29387 | 48402 | 20992 | 21022 |
| ko | 154713 | 239887 | 689906 | 633527 | 634725 |
| koi | 682 | 1010 | 1815 | 1144 | 1144 |
| krc | 423 | 698 | 2022 | 841 | 846 |
| ks | 888 | 1006 | 1692 | 645 | 670 |
| ksh | 918 | 1156 | 2951 | 1053 | 1055 |
| ku | 10060 | 12771 | 29766 | 23050 | 23232 |
| kv | 1105 | 1456 | 3365 | 2787 | 2787 |
| kw | 1820 | 2171 | 5570 | 3076 | 3082 |
| ky | 16655 | 21571 | 31213 | 21712 | 21757 |
| la | 22397 | 26732 | 161732 | 142447 | 142486 |
| lad | 961 | 1286 | 3984 | 2056 | 2056 |
| lb | 15385 | 19667 | 60568 | 46664 | 46730 |
| lbe | 207 | 232 | 488 | 290 | 290 |
| lez | 1184 | 1764 | 3829 | 2760 | 2760 |
| lfn | 1455 | 2435 | 3328 | 1602 | 1604 |
| lg | 1272 | 2650 | 1795 | 239 | 305 |
| li | 4501 | 6650 | 24213 | 15790 | 15826 |
| lij | 1781 | 2607 | 6658 | 3933 | 3933 |
| lld | 17293 | 17539 | 64059 | 49327 | 50864 |
| lmo | 12641 | 14976 | 40217 | 29874 | 29946 |
| ln | 585 | 692 | 1321 | 996 | 997 |
| lo | 1144 | 1680 | 3023 | 991 | 1013 |
| lt | 62652 | 85962 | 300456 | 269264 | 270227 |
| ltg | 289 | 341 | 686 | 285 | 285 |
| lv | 34742 | 48371 | 160433 | 136594 | 136873 |
| mad | 284 | 381 | 439 | 135 | 136 |
| mai | 2184 | 2499 | 5878 | 4209 | 4212 |
| map_bms | 1539 | 1847 | 7486 | 5705 | 5705 |
| mdf | 1086 | 1244 | 2512 | 1077 | 1077 |
| mg | 20361 | 23650 | 36313 | 29821 | 29974 |
| mhr | 2863 | 3594 | 6538 | 4114 | 4122 |
| mi | 1078 | 1154 | 3214 | 2743 | 2776 |
| min | 42987 | 46277 | 143692 | 55809 | 56077 |
| mk | 46235 | 76890 | 219310 | 180884 | 181042 |
| ml | 31116 | 46345 | 88976 | 53726 | 53818 |
| mn | 8485 | 13887 | 32271 | 15330 | 15455 |
| mni | 1843 | 2102 | 3418 | 2183 | 2325 |
| mnw | 1284 | 3750 | 897 | 202 | 224 |
| mr | 26803 | 36202 | 70510 | 43103 | 44352 |
| mrj | 2062 | 2297 | 5627 | 2888 | 2888 |
| ms | 75473 | 110077 | 270064 | 215280 | 215811 |
| mt | 2516 | 5510 | 11680 | 5760 | 5761 |
| mwl | 1828 | 4316 | 15365 | 3216 | 3287 |
| my | 24005 | 37165 | 49321 | 33223 | 33518 |
| myv | 1732 | 2327 | 4094 | 2923 | 2925 |
| mzn | 3784 | 4409 | 9938 | 5199 | 5205 |
| nah | 1128 | 1314 | 3316 | 1418 | 1556 |
| nap | 2047 | 2473 | 4579 | 2249 | 2249 |
| nds | 20646 | 26845 | 65355 | 34090 | 34094 |
| nds_nl | 2127 | 3063 | 10188 | 5585 | 5587 |
| ne | 6956 | 10087 | 16847 | 13502 | 13536 |
| new | 22645 | 27233 | 50860 | 32165 | 32217 |
| nia | 312 | 430 | 512 | 277 | 329 |
| nl | 490380 | 651743 | 1994062 | 1874588 | 1875259 |
| nn | 44180 | 60918 | 194747 | 153072 | 153140 |
| no | 172653 | 245377 | 779775 | 715618 | 716153 |
| nov | 339 | 410 | 861 | 452 | 452 |
| nqo | 583 | 1037 | 2598 | 704 | 813 |
| nrm | 1318 | 1600 | 4276 | 3734 | 3736 |
| nso | 960 | 1038 | 4242 | 4119 | 4119 |
| nv | 5649 | 6281 | 13652 | 11768 | 11768 |
| ny | 236 | 318 | 392 | 126 | 126 |
| oc | 23067 | 33775 | 115155 | 87980 | 88063 |
| olo | 1273 | 1598 | 2162 | 997 | 998 |
| om | 401 | 830 | 891 | 401 | 412 |
| or | 6261 | 8669 | 16120 | 6752 | 6757 |
| os | 3923 | 4535 | 9130 | 5470 | 5524 |
| pa | 17242 | 24844 | 37813 | 21759 | 21812 |
| pag | 1602 | 4519 | 404 | 300 | 300 |
| pam | 1509 | 1831 | 6019 | 3230 | 3272 |
| pap | 773 | 1376 | 2526 | 2042 | 2056 |
| pcd | 1089 | 1361 | 1803 | 1334 | 1338 |
| pcm | 353 | 542 | 409 | 128 | 139 |
| pdc | 370 | 565 | 839 | 424 | 429 |
| pfl | 1113 | 1500 | 2861 | 1070 | 1070 |
| pi | 578 | 682 | 881 | 26 | 26 |
| pih | 118 | 125 | 317 | 217 | 218 |
| pl | 444095 | 621669 | 2149058 | 2041686 | 2043400 |
| pms | 16530 | 19186 | 41547 | 34783 | 35474 |
| pnb | 21586 | 44654 | 103992 | 58461 | 59380 |
| pnt | 147 | 172 | 389 | 177 | 178 |
| ps | 7566 | 14922 | 8427 | 4108 | 4187 |
| pt | 349931 | 580790 | 1868210 | 1745832 | 1745858 |
| pwn | 103 | 166 | 85 | 31 | 31 |
| qu | 4540 | 5211 | 14781 | 11746 | 11750 |
| rm | 1076 | 3100 | 5539 | 2293 | 2298 |
| rmy | 214 | 235 | 446 | 176 | 184 |
| rn | 125 | 172 | 124 | 53 | 53 |
| ro | 106169 | 168972 | 473512 | 416263 | 416347 |
| roa_rup | 214 | 290 | 458 | 254 | 254 |
| roa_tara | 1278 | 1979 | 4455 | 1534 | 1534 |
| ru | 806592 | 1369860 | 3416036 | 3245837 | 3247963 |
| rue | 2022 | 2513 | 7023 | 5064 | 5066 |
| rw | 2577 | 3925 | 4139 | 2223 | 2349 |
| sa | 4344 | 8607 | 11313 | 4249 | 4391 |
| sah | 4729 | 8472 | 9040 | 6623 | 6660 |
| sat | 3485 | 4960 | 6473 | 3225 | 3278 |
| sc | 1900 | 2807 | 7641 | 5096 | 5098 |
| scn | 4263 | 5604 | 14333 | 11167 | 11171 |
| sco | 7382 | 9639 | 33771 | 16432 | 16453 |
| sd | 3970 | 5499 | 8879 | 3804 | 3925 |
| se | 982 | 1149 | 2841 | 1958 | 1958 |
| sg | 67 | 72 | 36 | 24 | 24 |
| sh | 103283 | 135121 | 484459 | 429555 | 429770 |
| shi | 477 | 679 | 1144 | 545 | 570 |
| shn | 3633 | 5630 | 5456 | 3627 | 3639 |
| si | 7672 | 14760 | 16443 | 6215 | 6346 |
| simple | 52503 | 68765 | 224811 | 187586 | 187598 |
| sk | 67520 | 93957 | 317232 | 272711 | 272779 |
| skr | 2090 | 6926 | 4136 | 1683 | 2359 |
| sl | 55621 | 89740 | 285769 | 228421 | 228530 |
| sm | 153 | 171 | 485 | 297 | 297 |
| smn | 1163 | 1420 | 4517 | 2681 | 2688 |
| sn | 1896 | 2139 | 4351 | 3384 | 3529 |
| so | 2358 | 4032 | 6064 | 5027 | 5083 |
| sq | 25223 | 41621 | 79295 | 59156 | 59350 |
| sr | 177997 | 258455 | 728755 | 584663 | 585394 |
| srn | 281 | 342 | 796 | 205 | 225 |
| ss | 188 | 259 | 265 | 125 | 125 |
| st | 157 | 198 | 248 | 164 | 166 |
| stq | 804 | 1162 | 3150 | 1816 | 1816 |
| su | 10348 | 13687 | 55055 | 42915 | 42944 |
| sv | 467467 | 558522 | 2473790 | 2382576 | 2382608 |
| sw | 18014 | 24348 | 90302 | 77817 | 78145 |
| szl | 11292 | 12173 | 52459 | 14419 | 14424 |
| szy | 2391 | 5418 | 2042 | 235 | 285 |
| ta | 59923 | 87114 | 183399 | 126977 | 127148 |
| tay | 1192 | 1757 | 1101 | 175 | 591 |
| tcy | 769 | 1077 | 1089 | 464 | 465 |
| te | 43790 | 79667 | 91327 | 69148 | 69484 |
| tet | 294 | 412 | 871 | 471 | 471 |
| tg | 27060 | 31599 | 86180 | 37522 | 37561 |
| th | 49169 | 78814 | 189768 | 154097 | 154453 |
| ti | 87 | 99 | 89 | 22 | 22 |
| tk | 1328 | 2612 | 2116 | 1056 | 1062 |
| tl | 11731 | 16623 | 49726 | 32858 | 32914 |
| tn | 296 | 424 | 477 | 278 | 278 |
| to | 254 | 277 | 393 | 230 | 233 |
| tpi | 180 | 207 | 394 | 216 | 217 |
| tr | 134938 | 200972 | 496960 | 440639 | 440790 |
| trv | 807 | 1814 | 400 | 53 | 98 |
| ts | 155 | 203 | 219 | 132 | 132 |
| tt | 113689 | 132676 | 228544 | 185563 | 185662 |
| tum | 2188 | 3516 | 6442 | 3105 | 4083 |
| tw | 1249 | 1885 | 1729 | 1217 | 1291 |
| ty | 162 | 167 | 215 | 143 | 143 |
| tyv | 1494 | 2486 | 2342 | 611 | 617 |
| udm | 1036 | 1240 | 2781 | 1957 | 1957 |
| ug | 2629 | 6556 | 2657 | 1479 | 1493 |
| uk | 203057 | 318240 | 758049 | 718278 | 718908 |
| ur | 54784 | 75152 | 206169 | 99493 | 100041 |
| uz | 65767 | 95465 | 149763 | 119192 | 120519 |
| ve | 128 | 148 | 256 | 229 | 229 |
| vec | 9463 | 11242 | 32188 | 22525 | 22531 |
| vep | 3225 | 4804 | 10375 | 4295 | 4295 |
| vi | 330763 | 455933 | 1211343 | 768936 | 769829 |
| vls | 2189 | 2904 | 7133 | 5776 | 5777 |
| vo | 7308 | 8647 | 13902 | 11270 | 11273 |
| wa | 4457 | 6269 | 12736 | 8751 | 8794 |
| war | 146537 | 149236 | 738087 | 666983 | 666983 |
| wo | 516 | 864 | 1083 | 404 | 414 |
| wuu | 5530 | 6448 | 13732 | 9168 | 9171 |
| xal | 407 | 449 | 549 | 308 | 308 |
| xh | 399 | 550 | 804 | 284 | 293 |
| xmf | 4516 | 5414 | 19437 | 2342 | 2447 |
| yi | 5260 | 7563 | 18821 | 12493 | 12510 |
| yo | 4431 | 5855 | 9761 | 5361 | 5410 |
| za | 335 | 414 | 777 | 457 | 458 |
| zea | 1470 | 1847 | 3682 | 2569 | 2574 |
| zh | 389361 | 611537 | 1817382 | 1592929 | 1597686 |
| zh_classical | 3601 | 4995 | 15834 | 8157 | 8170 |
| zh_min_nan | 87849 | 94529 | 291330 | 227978 | 228083 |
| zh_yue | 23579 | 30146 | 92720 | 75081 | 75096 |
| zu | 1646 | 2050 | 2518 | 2228 | 2234 |
**NOTE:** The number of articles in the tables above refers to the number of articles that have at least one paragraph appearing in the split.
## Additional Information
### Licensing Information
The WikiAnc dataset is released under the [Creative Commons Attribution ShareAlike 4.0 International](https://creativecommons.org/licenses/by-sa/4.0/) license.
|
gigant/tib_transcripts | ---
dataset_info:
features:
- name: doi
dtype: string
- name: transcript
dtype: string
- name: abstract
dtype: string
splits:
- name: train
num_bytes: 251058543
num_examples: 8481
download_size: 130991914
dataset_size: 251058543
---
# Dataset Card for "tib_transcripts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NingLab/ECInstruct | ---
license: cc-by-4.0
task_categories:
- text-classification
- question-answering
- zero-shot-classification
- feature-extraction
- text-generation
language:
- en
tags:
- Large Language Models
size_categories:
- 100K<n<1M
---
# Dataset Card for ECInstruct
ECInstruct comprises 10 tasks, including attribute value extraction, product relation prediction,
product matching, sentiment analysis, sequential recommendation, multiclass product classification, product
substitute identification, query-product ranking, answerability prediction, and answer generation.
ECInstruct is split into training sets, validation sets, in-domain (IND)
test sets, and out-of-domain (OOD) test sets.
We also provide the [product labels](https://github.com/ninglab/eCeLLM/blob/main/data_label/label.json) for the test set of the query-product ranking task,
which can be used for evaluation. Please check https://github.com/amazon-science/esci-data for more details.
## Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [GitHub](https://github.com/ninglab/eCeLLM)
- **Homepage:** [eCeLLM](https://ninglab.github.io/eCeLLM/)
## Data Split
The statistic of the ECInstruct Dataset is shown in the table below.
| Split | Size |
| --- | --- |
| Train | 92,022 |
| Validation | 9,253 |
| Test_IND | 9,253 |
| Test_OOD | 6,000 |
| Total | 116,528 |
## Usage
As detailed in the paper,
for each task we can conduct training and evaluation under multiple settings.
For example, <code>setting = IND_Diverse_Instruction, task = Answer_Generation</code> indicates
the training set for learning models on the answer generation task with diverse instructions for the IND test set.
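Concretely, a given (setting, task) pair can be selected by filtering on those columns once the data is loaded (see Quick Start below). The sketch here uses a toy DataFrame that mimics the schema; the `setting` and `task` column names are assumptions based on this card:

```python
import pandas as pd

# Toy rows mimicking the ECInstruct schema; the `setting` and `task`
# column names are assumptions based on this card, not the loaded data.
rows = pd.DataFrame({
    "setting": ["IND_Diverse_Instruction", "OOD_Diverse_Instruction",
                "IND_Diverse_Instruction"],
    "task": ["Answer_Generation", "Answer_Generation", "Product_Matching"],
})

# Select the IND answer-generation training rows.
subset = rows[(rows["setting"] == "IND_Diverse_Instruction")
              & (rows["task"] == "Answer_Generation")]
print(len(subset))  # 1
```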
## Quick Start
Run the following command to get the data:
```python
from datasets import load_dataset
dataset = load_dataset("NingLab/ECInstruct")
```
For better reproducibility, besides the ECInstruct dataset,
we also put all the data used in our various analyses on Hugging Face.
That is, the Hugging Face dataset comprises the ECInstruct dataset,
the data samples with seed and unseen instructions used in the analyses of Sections 6.3 and G.1,
and the data samples used for 1-shot evaluation.
If you are only interested in the ECInstruct dataset itself,
please refer to the following code to extract it:
```python
from datasets import load_dataset
import pandas as pd
dataset = pd.DataFrame(load_dataset("NingLab/ECInstruct")['train'])
# Keep only the rows from the IND/OOD diverse-instruction settings.
default_dataset = dataset[(dataset['setting'] == 'IND_Diverse_Instruction') | (dataset['setting'] == 'OOD_Diverse_Instruction')]
```
## License
Please check the license of each subset in our curated dataset ECInstruct.
| Dataset | License Type |
| --- | --- |
| Amazon-Google Products | CC-by-4.0 |
| Amazon Review | None listed |
| AmazonQA | None listed |
| Shopping Queries Dataset | Apache License 2.0 |
## Citation
```bibtex
@misc{peng2024ecellm,
title={eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
year={2024},
eprint={2402.08831},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
COMP0087-GROUP8-22-23/PERC | ---
task_categories:
- text-classification
- text-generation
language:
- en
tags:
- art
pretty_name: PERC
size_categories:
- 1K<n<10K
---
Reference: Ponnarassery, Sreeja (2017), “Poem Emotion Recognition Corpus (PERC)”, Mendeley Data, V1, doi: 10.17632/n9vbc8g9cx.1 |
k0hooo/datacampus | ---
language:
- ko
--- |
NghiemAbe/ViNLI_3_triplet | ---
dataset_info:
features:
- name: anchor
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
splits:
- name: train
num_bytes: 1245410
num_examples: 3036
download_size: 723618
dataset_size: 1245410
---
# Dataset Card for "ViNLI_3_triplet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sachith-surge/Evol-Instruct | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: evolution_strategy
dtype: string
- name: in-depth-evolving_operation
dtype: string
- name: epoch
dtype: int64
- name: falcon_status
dtype: string
- name: falcon_rating
dtype: string
- name: falcon_reason
dtype: string
- name: gpt4_status
dtype: string
- name: gpt4_rating
dtype: string
- name: gpt4_reason
dtype: string
splits:
- name: train
num_bytes: 4701491
num_examples: 2304
download_size: 2438727
dataset_size: 4701491
---
# Dataset Card for "evol-instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/stable_diffusion_prompts | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Label
dtype: string
splits:
- name: train
num_bytes: 828496
num_examples: 5000
download_size: 213345
dataset_size: 828496
---
# Stable Diffusion Prompts Dataset
The Stable Diffusion Prompts Dataset is a collection of art-style prompts for the Stable Diffusion model. It provides prompts for generating art-related descriptions, with a specific art style as the label for each prompt.
## Dataset Details
- Dataset Name: Stable Diffusion Prompts Dataset
- Number of Prompts: 5000
- Labels: Art Style Names
## Dataset Structure
The dataset is provided in a CSV file format with the following structure:
| Prompt | Label |
| --------------------------------------- | ------------- |
| A [fantasy_art_style] illustration... | fantasy_style |
| A [impressionist_art_style] painting... | impressionist |
| ... | ... |
The `Prompt` column contains the generated prompts, while the `Label` column corresponds to the art style name associated with each prompt.
## Loading the Dataset
You can load the Stable Diffusion Prompts Dataset using the `load_dataset` function from the Hugging Face Datasets library. Here's an example:
```python
from datasets import load_dataset
# Load the dataset
dataset = load_dataset("Falah/stable_diffusion_prompts", split="train")
# Access the prompts and labels
prompts = dataset['Prompt']
labels = dataset['Label']
```
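From there, simple analyses such as the label distribution follow directly. A minimal sketch on toy labels (the real `labels` list comes from the code above):

```python
from collections import Counter

# Toy labels standing in for the real `Label` column loaded above.
labels = ["fantasy_style", "impressionist", "fantasy_style"]

label_counts = Counter(labels)
print(label_counts.most_common(1))  # [('fantasy_style', 2)]
```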
## Usage
The Stable Diffusion Prompts Dataset can be used as training data for the Stable Diffusion Model or any other language generation model. It enables the model to learn and generate prompts related to different art styles.
## Citation
If you use this dataset in your research or any other work, please consider citing it as:
```
@dataset{stable_diffusion_prompts,
author = {Falah.G.Salieh},
title = {Stable Diffusion Prompts Dataset},
year = {2023},
publisher = {Hugging Face},
version = {1.0},
url = {https://huggingface.co/datasets/Falah/stable_diffusion_prompts}
}
```
## License
The Stable Diffusion Prompts Dataset is provided under the [Creative Commons Attribution-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-sa/4.0/).
## Acknowledgements
We would like to acknowledge the contributors and authors of the original prompts dataset used for creating this Stable Diffusion Prompts Dataset.
|
Hackpk/nrr | ---
license: openrail
---
|
bdsaglam/musique-answerable-2hop-subset | ---
dataset_info:
features:
- name: id
dtype: string
- name: paragraphs
list:
- name: idx
dtype: int64
- name: title
dtype: string
- name: paragraph_text
dtype: string
- name: is_supporting
dtype: bool
- name: question
dtype: string
- name: question_decomposition
list:
- name: id
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: paragraph_support_idx
dtype: int64
- name: answer
dtype: string
- name: answer_aliases
sequence: string
- name: answerable
dtype: bool
splits:
- name: train
num_bytes: 18190.0
num_examples: 4
download_size: 16499
dataset_size: 18190.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "musique-answerable-2hop-subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_mmlu_en_conf_llama_nearestscore_true | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 130579.0
num_examples: 250
download_size: 0
dataset_size: 130579.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_en_conf_llama_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thangvip/orca-filtered | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2241780185.3238597
num_examples: 1314379
download_size: 639463930
dataset_size: 2241780185.3238597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "orca-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
trooaditya/fashion_accessories_dataset_first_10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 38250.0
num_examples: 10
download_size: 32078
dataset_size: 38250.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ghoumrassi/clothes_sample | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 20078406.0
num_examples: 990
download_size: 0
dataset_size: 20078406.0
---
# Dataset Card for "clothes_sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
InMedData/Cardio_v2 | ---
license: cc-by-nc-sa-4.0
language:
- en
tags:
- biology
- medical
size_categories:
- 100K<n<1M
---
# Dataset Card
<!-- Provide a quick summary of the dataset. -->
This dataset consists of abstracts from heart-related papers collected from PubMed. It can be used for pre-training a language model specialized in cardiology.
The dataset was collected through the PubMed API, based on the names of heart-related journals and a glossary of cardiology terms.
# Dataset
## Data Sources
- **[Pubmed](https://pubmed.ncbi.nlm.nih.gov/)**: PubMed is a database that provides abstracts of research papers related to life sciences, biomedical fields, health psychology, and health and welfare. Among these, we have collected abstracts of papers related to the heart.
- **[Wikipedia](https://www.wikipedia.org/)**: Wikipedia is an internet encyclopedia that anyone can edit and is maintained through collaboration.
## Keywords Sources
- **[Scimago Journal & Country Rank](https://www.scimagojr.com/journalrank.php?category=2705#google_vignette)** : We used a list of cardiology-related journals provided by SJR as keywords for data collection.
- **[National Institutes of Health](https://www.nia.nih.gov/health/heart-health/heart-health-glossary)** : We used a glossary provided by NIH as keywords for data collection.
- **[The Texas Heart Institute](https://www.texasheart.org/heart-health/heart-information-center/topics/a-z)** : We used a glossary provided by Texas Heart Institute as keywords for data collection.
- **[Aiken Physicians Alliance](https://aikenphysicians.com/services/cardiology/cardiology-glossary-of-terms)** : We used a glossary provided by Aiken Physicians Alliance as keywords for data collection.
## Dataset Field
| Field | Data Type | Description |
| --- | --- | --- |
| title | string | The title of the paper. |
| abst | string | The abstract of the paper. |
## Dataset Structure
```python
DatasetDict({
train: Dataset({
features: ['title', 'abst'],
num_rows: 2761083
})
})
```
## Use
```python
from datasets import load_dataset
dataset = load_dataset("InMedData/Cardio_v2")
```
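Each record then carries the two string fields from the table above. A toy record illustrating the shape (the field contents here are made up for illustration):

```python
# A toy record with the two-field schema described above (contents hypothetical).
record = {
    "title": "Aspirin therapy after myocardial infarction",
    "abst": "Background: We studied the effect of low-dose aspirin ...",
}

assert set(record) == {"title", "abst"}
print(record["title"])
```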
### Dataset Contact
khs1220@inmed-data.com
|
autoevaluate/autoeval-staging-eval-project-5968bffe-3bbc-4366-a1a8-9d11b19abcf7-6862 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
rahulchakwate/squad-QG-dataset-original | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 78708152
num_examples: 87599
download_size: 13814066
dataset_size: 78708152
---
# Dataset Card for "squad-QG-dataset-original"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Akshay-Sai/torgo_70_30 | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': control
'1': pathology
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 4344541968
num_examples: 4524
- name: test
num_bytes: 1863044080
num_examples: 1940
download_size: 753776953
dataset_size: 6207586048
---
# Dataset Card for "torgo_70_30"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SUSTech/bagel-clean | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: source
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 1842009663
num_examples: 798633
download_size: 922779621
dataset_size: 1842009663
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sebascorreia/jazz-set | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 82089970.0
num_examples: 1848
download_size: 81976967
dataset_size: 82089970.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "jazz-set"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Kukedlc__Fasciculus-Arcuatus-7B-slerp | ---
pretty_name: Evaluation run of Kukedlc/Fasciculus-Arcuatus-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/Fasciculus-Arcuatus-7B-slerp](https://huggingface.co/Kukedlc/Fasciculus-Arcuatus-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__Fasciculus-Arcuatus-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T23:40:32.989445](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Fasciculus-Arcuatus-7B-slerp/blob/main/results_2024-02-29T23-40-32.989445.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532779571834857,\n\
\ \"acc_stderr\": 0.03202888742445537,\n \"acc_norm\": 0.6521134883922671,\n\
\ \"acc_norm_stderr\": 0.03270767984126102,\n \"mc1\": 0.5850673194614443,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.7253332381469298,\n\
\ \"mc2_stderr\": 0.01466478214799473\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274776,\n\
\ \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7193786098386775,\n\
\ \"acc_stderr\": 0.004483845735187827,\n \"acc_norm\": 0.8894642501493726,\n\
\ \"acc_norm_stderr\": 0.0031291555038817165\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n\
\ \"acc_stderr\": 0.016525425898773507,\n \"acc_norm\": 0.423463687150838,\n\
\ \"acc_norm_stderr\": 0.016525425898773507\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5850673194614443,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.7253332381469298,\n\
\ \"mc2_stderr\": 0.01466478214799473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.009834691297450121\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \
\ \"acc_stderr\": 0.01249392734865963\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/Fasciculus-Arcuatus-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|arc:challenge|25_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|gsm8k|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hellaswag|10_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T23-40-32.989445.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T23-40-32.989445.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- '**/details_harness|winogrande|5_2024-02-29T23-40-32.989445.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T23-40-32.989445.parquet'
- config_name: results
data_files:
- split: 2024_02_29T23_40_32.989445
path:
- results_2024-02-29T23-40-32.989445.parquet
- split: latest
path:
- results_2024-02-29T23-40-32.989445.parquet
---
# Dataset Card for Evaluation run of Kukedlc/Fasciculus-Arcuatus-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/Fasciculus-Arcuatus-7B-slerp](https://huggingface.co/Kukedlc/Fasciculus-Arcuatus-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__Fasciculus-Arcuatus-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-29T23:40:32.989445](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Fasciculus-Arcuatus-7B-slerp/blob/main/results_2024-02-29T23-40-32.989445.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6532779571834857,
"acc_stderr": 0.03202888742445537,
"acc_norm": 0.6521134883922671,
"acc_norm_stderr": 0.03270767984126102,
"mc1": 0.5850673194614443,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.7253332381469298,
"mc2_stderr": 0.01466478214799473
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274776,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.7193786098386775,
"acc_stderr": 0.004483845735187827,
"acc_norm": 0.8894642501493726,
"acc_norm_stderr": 0.0031291555038817165
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773507,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773507
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5850673194614443,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.7253332381469298,
"mc2_stderr": 0.01466478214799473
},
"harness|winogrande|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.009834691297450121
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.01249392734865963
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
graphs-datasets/MUTAG | ---
license: unknown
task_categories:
- graph-ml
---
# Dataset Card for MUTAG
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [External Use](#external-use)
- [PyGeometric](#pygeometric)
- [Dataset Structure](#dataset-structure)
- [Data Properties](#data-properties)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Homepage](https://pubs.acs.org/doi/abs/10.1021/jm00106a046)**
- **[Repository](https://www.chrsmrrs.com/graphkerneldatasets/MUTAG.zip)**
- **Paper:** Structure-activity relationship of mutagenic aromatic and heteroaromatic nitro compounds. Correlation with molecular orbital energies and hydrophobicity (see citation)
- **Leaderboard:** [Papers with code leaderboard](https://paperswithcode.com/sota/graph-classification-on-mutag)
### Dataset Summary
The `MUTAG` dataset is 'a collection of nitroaromatic compounds and the goal is to predict their mutagenicity on Salmonella typhimurium'.
### Supported Tasks and Leaderboards
`MUTAG` should be used for molecular property prediction (aiming to predict whether molecules have a mutagenic effect on a given bacterium or not), a binary classification task. The score used is accuracy, using a 10-fold cross-validation.
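As a hedged sketch of the evaluation protocol described above (10-fold cross-validation on accuracy), the following stands in a trivial majority-class baseline for a real graph classifier; the labels here are randomly generated placeholders, not actual MUTAG targets:

```python
import random

# Hypothetical 0/1 labels standing in for MUTAG's mutagenicity targets;
# replace with the real labels and a real model's predictions.
random.seed(0)
labels = [random.randint(0, 1) for _ in range(188)]

# Simple 10-fold split: shuffle indices, then take every 10th index per fold.
indices = list(range(len(labels)))
random.shuffle(indices)
folds = [indices[i::10] for i in range(10)]

fold_accs = []
for k in range(10):
    test_idx = set(folds[k])
    train_labels = [labels[i] for i in indices if i not in test_idx]
    # Trivial baseline: predict the training fold's majority class.
    majority = max(set(train_labels), key=train_labels.count)
    correct = sum(labels[i] == majority for i in folds[k])
    fold_accs.append(correct / len(folds[k]))

mean_acc = sum(fold_accs) / len(fold_accs)
print(f"10-fold accuracy: {mean_acc:.3f}")
```

A real evaluation would replace the majority-class prediction with a trained model's outputs on each held-out fold.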
## External Use
### PyGeometric
To load in PyGeometric, do the following:
```python
import torch
from datasets import load_dataset
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader

dataset_hf = load_dataset("graphs-datasets/MUTAG")
# For the train set (replace by valid or test as needed);
# each row is a dict of lists, so convert its fields to tensors first
dataset_pg_list = [
    Data(x=torch.tensor(g["node_feat"], dtype=torch.float),
         edge_index=torch.tensor(g["edge_index"], dtype=torch.long),
         edge_attr=torch.tensor(g["edge_attr"], dtype=torch.float),
         y=torch.tensor(g["y"]))
    for g in dataset_hf["train"]
]
dataset_pg = DataLoader(dataset_pg_list)
```
## Dataset Structure
### Data Properties
| property | value |
|---|---|
| scale | small |
| #graphs | 187 |
| average #nodes | 18.03 |
| average #edges | 39.80 |
### Data Fields
Each row of a given file is a graph, with:
- `node_feat` (list: #nodes x #node-features): nodes
- `edge_index` (list: 2 x #edges): pairs of nodes constituting edges
- `edge_attr` (list: #edges x #edge-features): for the aforementioned edges, contains their features
- `y` (list: 1 x #labels): the graph-level label to predict (here a single binary label, 0 or 1)
- `num_nodes` (int): number of nodes of the graph
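The fields above fit together as follows; this toy row uses the same schema (the feature values and edges are made up for illustration, while real rows come from `load_dataset("graphs-datasets/MUTAG")`):

```python
# A hypothetical row in the MUTAG schema: 3 nodes, 4 directed edges.
graph = {
    "node_feat": [[1.0, 0.0], [0.0, 1.0], [1.0, 0.0]],  # #nodes x #node-features
    "edge_index": [[0, 1, 1, 2], [1, 0, 2, 1]],          # 2 x #edges (source row, target row)
    "edge_attr": [[1.0], [1.0], [0.0], [0.0]],           # #edges x #edge-features
    "y": [1],                                            # single binary graph label
    "num_nodes": 3,
}

# Consistency checks implied by the schema.
assert len(graph["node_feat"]) == graph["num_nodes"]
assert len(graph["edge_index"]) == 2
assert len(graph["edge_attr"]) == len(graph["edge_index"][0])
print(f'{graph["num_nodes"]} nodes, {len(graph["edge_index"][0])} edges, label {graph["y"][0]}')
```

Note that `edge_index` pairs up column-wise: edge `i` goes from node `edge_index[0][i]` to node `edge_index[1][i]`, and `edge_attr[i]` holds that edge's features.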
### Data Splits
This data comes from the PyTorch Geometric (TUDataset) version of the dataset, and follows the provided data splits.
This information can be retrieved with:
```python
from torch_geometric.datasets import TUDataset
cur_dataset = TUDataset(root="../dataset/loaded/",
name="MUTAG")
```
## Additional Information
### Licensing Information
The dataset has been released under an unknown license; please open an issue if you have licensing information.
### Citation Information
```
@article{doi:10.1021/jm00106a046,
author = {Debnath, Asim Kumar and Lopez de Compadre, Rosa L. and Debnath, Gargi and Shusterman, Alan J. and Hansch, Corwin},
title = {Structure-activity relationship of mutagenic aromatic and heteroaromatic nitro compounds. Correlation with molecular orbital energies and hydrophobicity},
journal = {Journal of Medicinal Chemistry},
volume = {34},
number = {2},
pages = {786-797},
year = {1991},
doi = {10.1021/jm00106a046},
URL = {
https://doi.org/10.1021/jm00106a046
},
eprint = {
https://doi.org/10.1021/jm00106a046
}
}
```
### Contributions
Thanks to [@clefourrier](https://github.com/clefourrier) for adding this dataset. |
chirunder/Vince_GRE_frequency | ---
dataset_info:
features:
- name: word
dtype: string
- name: frequency
dtype: int64
splits:
- name: train
num_bytes: 58131
num_examples: 2882
download_size: 31861
dataset_size: 58131
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Vince_GRE_frequency"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edbeeching/gia-dataset-parquet-debug | ---
dataset_info:
- config_name: atari-alien
features:
- name: image_observations
sequence: image
- name: rewards
sequence: float32
- name: discrete_actions
sequence: int64
splits:
- name: test
num_bytes: 26566416.0
num_examples: 2
- name: train
num_bytes: 22539851.0
num_examples: 2
download_size: 49578302
dataset_size: 49106267.0
- config_name: atari-breakout
features:
- name: image_observations
sequence: image
- name: rewards
sequence: float32
- name: discrete_actions
sequence: int64
splits:
- name: test
num_bytes: 17689596.0
num_examples: 2
- name: train
num_bytes: 9524039.0
num_examples: 2
download_size: 25423698
dataset_size: 27213635.0
- config_name: mujoco-ant
features:
- name: continuous_observations
sequence:
sequence: float32
length: 27
- name: continuous_actions
sequence:
sequence: float32
length: 8
- name: rewards
sequence: float32
splits:
- name: test
num_bytes: 288024
num_examples: 2
- name: train
num_bytes: 288024
num_examples: 2
download_size: 858378
dataset_size: 576048
configs:
- config_name: atari-alien
data_files:
- split: test
path: atari-alien/test-*
- split: train
path: atari-alien/train-*
- config_name: atari-breakout
data_files:
- split: test
path: atari-breakout/test-*
- split: train
path: atari-breakout/train-*
- config_name: mujoco-ant
data_files:
- split: test
path: mujoco-ant/test-*
- split: train
path: mujoco-ant/train-*
---
# Dataset Card for "gia-dataset-parquet-debug"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_247 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 805915932
num_examples: 158271
download_size: 819081675
dataset_size: 805915932
---
# Dataset Card for "chunk_247"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Anwaarma/BP-balanced-S02 | ---
dataset_info:
features:
- name: Target
dtype: int64
- name: PC
dtype: string
- name: GSHARE
dtype: string
- name: GA table
dtype: string
splits:
- name: train
num_bytes: 117380500
num_examples: 234761
- name: test
num_bytes: 29345500
num_examples: 58691
download_size: 41291217
dataset_size: 146726000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kristmh/highest_and_high_vs_rest | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: text_clean
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: test
num_bytes: 1961211
num_examples: 2887
- name: train
num_bytes: 15937151
num_examples: 23084
- name: validate
num_bytes: 2019930
num_examples: 2885
download_size: 9088204
dataset_size: 19918292
---
# Dataset Card for "highest_and_high_vs_rest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aaqibsaeed/databricks-dolly-15k-ur | ---
license: cc-by-3.0
---
This dataset was created by translating "databricks-dolly-15k.jsonl" into Urdu. It is licensed under CC BY 3.0.
This dataset was prepared by translating "databricks-dolly" into Urdu.
databricks-dolly-15k https://github.com/databrickslabs/dolly/tree/master/data |
liuyanchen1015/MULTI_VALUE_sst2_relativizer_doubling | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 25787
num_examples: 152
- name: test
num_bytes: 49928
num_examples: 299
- name: train
num_bytes: 715356
num_examples: 4983
download_size: 441345
dataset_size: 791071
---
# Dataset Card for "MULTI_VALUE_sst2_relativizer_doubling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adudgaon/email_responses | ---
license: mit
---
Incoming Email: Hello Amirsha sir, I thank you for this opportunity, but as I am out of station I will not be able to attend the interview on the scheduled date, so I am sorry for the inconvenience. Please do consider my request and reschedule the interview as soon as possible after 3-01-2024. I hope you will understand my situation and do the needful.
Response: Okay, I will schedule it for January 4th, 2023, at 11:30 AM |
micklerj/Jenna1 | ---
license: other
license_name: license
license_link: LICENSE
---
|