| datasetId | card |
|---|---|
Drozdik/tattoo_v3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 101626056.169
num_examples: 4239
download_size: 78738858
dataset_size: 101626056.169
---
# Dataset Card for "tattoo_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dthomas84/rule1_embeddings | ---
license: mit
---
|
EleutherAI/quirky_authors_raw | ---
dataset_info:
features:
- name: id
dtype: string
- name: template_args
struct:
- name: author
dtype: string
- name: character
dtype: string
- name: title
dtype: string
- name: character
dtype: string
- name: label
dtype: bool
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: float64
- name: difficulty_quantile
dtype: float64
splits:
- name: train
num_bytes: 2192673
num_examples: 19437
- name: validation
num_bytes: 455357
num_examples: 4000
- name: test
num_bytes: 458473
num_examples: 4000
download_size: 1525747
dataset_size: 3106503
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
TariqJamil/guanaco-llama2-3k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4712473
num_examples: 3000
download_size: 2783233
dataset_size: 4712473
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
inswave/AISqaure_Intergrated_v2 | ---
license: cc-by-nc-sa-4.0
---
|
open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k | ---
pretty_name: Evaluation run of ehartford/dolphin-2.2-yi-34b-200k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/dolphin-2.2-yi-34b-200k](https://huggingface.co/ehartford/dolphin-2.2-yi-34b-200k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T09:19:14.695653](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k/blob/main/results_2023-12-10T09-19-14.695653.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5443333155719463,\n\
\ \"acc_stderr\": 0.03403073973019475,\n \"acc_norm\": 0.5545570631884628,\n\
\ \"acc_norm_stderr\": 0.034865135931915724,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839672,\n \"mc2\": 0.45931787186509654,\n\
\ \"mc2_stderr\": 0.0156737639267665\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3924914675767918,\n \"acc_stderr\": 0.014269634635670714,\n\
\ \"acc_norm\": 0.42150170648464164,\n \"acc_norm_stderr\": 0.014430197069326023\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5135431189006174,\n\
\ \"acc_stderr\": 0.004987950663406538,\n \"acc_norm\": 0.6818362875921131,\n\
\ \"acc_norm_stderr\": 0.00464811532232878\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273956,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273956\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178263,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178263\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.02529460802398648,\n \
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.02529460802398648\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478464,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478464\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.019109299846098295,\n \"\
acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.019109299846098295\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7254901960784313,\n\
\ \"acc_stderr\": 0.031321798030832904,\n \"acc_norm\": 0.7254901960784313,\n\
\ \"acc_norm_stderr\": 0.031321798030832904\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n\
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.047928981709070624,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.047928981709070624\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.04865777570410769,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.04865777570410769\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398687,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398687\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.523121387283237,\n \"acc_stderr\": 0.026890297881303125,\n\
\ \"acc_norm\": 0.523121387283237,\n \"acc_norm_stderr\": 0.026890297881303125\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n\
\ \"acc_stderr\": 0.01598420454526857,\n \"acc_norm\": 0.35307262569832404,\n\
\ \"acc_norm_stderr\": 0.01598420454526857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159628,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159628\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.027237415094592477,\n\
\ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.027237415094592477\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806178,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n\
\ \"acc_stderr\": 0.01264536143511522,\n \"acc_norm\": 0.4302477183833116,\n\
\ \"acc_norm_stderr\": 0.01264536143511522\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468317,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5228758169934641,\n \"acc_stderr\": 0.02020665318788478,\n \
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.02020665318788478\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824565,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824565\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839672,\n \"mc2\": 0.45931787186509654,\n\
\ \"mc2_stderr\": 0.0156737639267665\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6456195737963694,\n \"acc_stderr\": 0.013443314368356088\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.037149355572403335,\n \
\ \"acc_stderr\": 0.00520951628307378\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/dolphin-2.2-yi-34b-200k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|arc:challenge|25_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|gsm8k|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hellaswag|10_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T09-19-14.695653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T09-19-14.695653.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- '**/details_harness|winogrande|5_2023-12-10T09-19-14.695653.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T09-19-14.695653.parquet'
- config_name: results
data_files:
- split: 2023_12_10T09_19_14.695653
path:
- results_2023-12-10T09-19-14.695653.parquet
- split: latest
path:
- results_2023-12-10T09-19-14.695653.parquet
---
# Dataset Card for Evaluation run of ehartford/dolphin-2.2-yi-34b-200k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/dolphin-2.2-yi-34b-200k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.2-yi-34b-200k](https://huggingface.co/ehartford/dolphin-2.2-yi-34b-200k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k",
	"harness_winogrande_5",
	split="latest")
```
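Split names in the configs above are derived from each run's timestamp. A minimal sketch of that convention, inferred from the YAML config list (the exact replacement rule is an assumption, not an official API):

```python
# Sketch: map a run timestamp to the split name used in this dataset's configs.
# The rule (colons and dashes become underscores, the fractional-second dot is
# kept) is inferred from the YAML above.
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

split = timestamp_to_split("2023-12-10T09:19:14.695653")
print(split)  # → 2023_12_10T09_19_14.695653
```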
## Latest results
These are the [latest results from run 2023-12-10T09:19:14.695653](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2-yi-34b-200k/blob/main/results_2023-12-10T09-19-14.695653.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each eval's results in its own configuration and its "latest" split):
```python
{
"all": {
"acc": 0.5443333155719463,
"acc_stderr": 0.03403073973019475,
"acc_norm": 0.5545570631884628,
"acc_norm_stderr": 0.034865135931915724,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839672,
"mc2": 0.45931787186509654,
"mc2_stderr": 0.0156737639267665
},
"harness|arc:challenge|25": {
"acc": 0.3924914675767918,
"acc_stderr": 0.014269634635670714,
"acc_norm": 0.42150170648464164,
"acc_norm_stderr": 0.014430197069326023
},
"harness|hellaswag|10": {
"acc": 0.5135431189006174,
"acc_stderr": 0.004987950663406538,
"acc_norm": 0.6818362875921131,
"acc_norm_stderr": 0.00464811532232878
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273956,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273956
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.02529460802398648,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.02529460802398648
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478464,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478464
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.019109299846098295,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.019109299846098295
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.047928981709070624,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.047928981709070624
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.04865777570410769,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.04865777570410769
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398687,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.523121387283237,
"acc_stderr": 0.026890297881303125,
"acc_norm": 0.523121387283237,
"acc_norm_stderr": 0.026890297881303125
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35307262569832404,
"acc_stderr": 0.01598420454526857,
"acc_norm": 0.35307262569832404,
"acc_norm_stderr": 0.01598420454526857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159628,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159628
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.027237415094592477,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.027237415094592477
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806178,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4302477183833116,
"acc_stderr": 0.01264536143511522,
"acc_norm": 0.4302477183833116,
"acc_norm_stderr": 0.01264536143511522
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468317,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.02020665318788478,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.02020665318788478
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824565,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839672,
"mc2": 0.45931787186509654,
"mc2_stderr": 0.0156737639267665
},
"harness|winogrande|5": {
"acc": 0.6456195737963694,
"acc_stderr": 0.013443314368356088
},
"harness|gsm8k|5": {
"acc": 0.037149355572403335,
"acc_stderr": 0.00520951628307378
}
}
```
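The top-level "all" entry aggregates the per-task scores. A minimal sketch of such an aggregation as an unweighted macro-average, using a small excerpt of the dict above (whether the leaderboard weights tasks equally is an assumption here):

```python
# Sketch: recompute an aggregate accuracy as the macro-average of per-task
# accuracies, over an illustrative excerpt of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5851851851851851},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6052631578947368},
}
task_accs = [v["acc"] for k, v in results.items() if k != "all"]
macro_acc = sum(task_accs) / len(task_accs)
print(round(macro_acc, 4))  # → 0.5035
```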
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
guillaume-chervet/test | ---
license: mit
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-125000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1084399
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
income/cqadupstack-physics-top-20-gen-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
  - 100K<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
---
# CQADupStack-Physics: 20 generated queries (BEIR Benchmark)
This HF dataset contains the top-20 synthetic queries generated for each passage in the above BEIR benchmark dataset.
- DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1)
- id (str): unique document id in CQADupStack-Physics in the BEIR benchmark (`corpus.jsonl`).
- Questions generated: 20
- Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py)
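Downstream, each generated query is typically paired with the id of the passage it was generated from, yielding synthetic (query, positive-passage) training pairs. A small sketch of that pairing step, with hypothetical record and field names (this is not this dataset's actual schema):

```python
# Sketch: turn per-document generated queries into (query, doc_id) training
# pairs. The record layout below is illustrative only.
records = [
    {"id": "doc1", "queries": ["what causes tides", "why do tides occur"]},
    {"id": "doc2", "queries": ["speed of light in a vacuum"]},
]
pairs = [(q, rec["id"]) for rec in records for q in rec["queries"]]
print(len(pairs))  # → 3
```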
The original dataset card for the BEIR benchmark is reproduced below.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
### Supported Tasks and Leaderboards
The benchmark supports a leaderboard that evaluates retrieval models on their zero-shot effectiveness, with nDCG@10 as the primary metric.
The current best performing models can be found on the [BEIR leaderboard](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. The first row is a header. For example: `q1 doc1 1`
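The three files above can be parsed into plain dictionaries with a few lines of Python. The sketch below is illustrative only (the official `beir` package ships a `GenericDataLoader` for this purpose, and in released zips the qrels file usually lives under a `qrels/` folder, e.g. `qrels/test.tsv` — the flat `qrels.tsv` default here is an assumption):

```python
import csv
import json
from pathlib import Path

def load_beir(data_dir, qrels_file="qrels.tsv"):
    """Parse corpus.jsonl, queries.jsonl and a qrels .tsv into plain dicts."""
    data_dir = Path(data_dir)
    corpus, queries, qrels = {}, {}, {}
    with open(data_dir / "corpus.jsonl", encoding="utf-8") as f:
        for line in f:
            doc = json.loads(line)
            corpus[doc["_id"]] = {"title": doc.get("title", ""), "text": doc["text"]}
    with open(data_dir / "queries.jsonl", encoding="utf-8") as f:
        for line in f:
            query = json.loads(line)
            queries[query["_id"]] = query["text"]
    with open(data_dir / qrels_file, encoding="utf-8") as f:
        rows = csv.reader(f, delimiter="\t")
        next(rows)  # skip the header row
        for query_id, corpus_id, score in rows:
            qrels.setdefault(query_id, {})[corpus_id] = int(score)
    return corpus, queries, qrels
```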
### Data Instances
A high-level example of any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
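Given `qrels` in this shape plus a ranked result list per query, retrieval quality on BEIR is reported mainly as nDCG@10. The sketch below uses the common logarithmic-discount formulation; official BEIR numbers are computed with `pytrec_eval`, so treat this as an illustration of the metric, not a reference implementation:

```python
import math

def ndcg_at_k(qrels, results, k=10):
    """qrels: {query_id: {doc_id: relevance}};
    results: {query_id: [doc_id, ...]} ranked best-first."""
    scores = []
    for query_id, ranked in results.items():
        relevant = qrels.get(query_id, {})
        # Discounted cumulative gain over the top-k retrieved documents.
        dcg = sum(relevant.get(doc_id, 0) / math.log2(rank + 2)
                  for rank, doc_id in enumerate(ranked[:k]))
        # Ideal DCG: the same discount applied to a perfect ranking.
        ideal = sorted(relevant.values(), reverse=True)[:k]
        idcg = sum(rel / math.log2(rank + 2) for rank, rel in enumerate(ideal))
        scores.append(dcg / idcg if idcg > 0 else 0.0)
    return sum(scores) / len(scores) if scores else 0.0
```

With the toy `qrels` above, ranking `doc1` first for `q1` scores 1.0, while ranking it second drops the score to `1 / log2(3)`.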
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
  - `query-id`: a `string` feature, representing the query id.
  - `corpus-id`: a `string` feature, denoting the document id.
- `score`: a `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
Gbssreejith/apitest | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 32590244.0
num_examples: 145
- name: test
num_bytes: 3881035.0
num_examples: 17
- name: valid
num_bytes: 9576842.0
num_examples: 41
download_size: 44004991
dataset_size: 46048121.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
cboettig/solara-data | ---
license: cc0-1.0
---
This repository serves as a public data cache for my Solara Template application, <https://huggingface.co/spaces/cboettig/solara-test> |
leo4life/algoml_bookcorpus_49_50p | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 51200453
num_examples: 740042
download_size: 31995075
dataset_size: 51200453
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mohammadhossein/addany-dataset | ---
dataset_info:
features:
- name: id
dtype: int64
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 2043824
num_examples: 1000
- name: validation
num_bytes: 271573
num_examples: 130
download_size: 933556
dataset_size: 2315397
---
# Dataset Card for "AddAny-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jseillade/landcover_control | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
- name: prompt
dtype: string
- name: labels_prompt
dtype: string
splits:
- name: train
num_bytes: 1077807673.12
num_examples: 10670
download_size: 1016013388
dataset_size: 1077807673.12
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "landcover_control"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamo1139/Concise_1 | ---
license: other
license_name: other
license_link: LICENSE
---
Based on Junrulu/Prompt_Preference_Dataset \
text003 is chosen, gpt4 response is rejected. \
text003 is much more concise and less slopped, so I want my models to be more like that, but you can switch it around easily if you want. |
ibranze/araproje_hellaswag_tr_conf_gpt2_nearestscore_true_x | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 0
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_gpt2_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
research-backup/conceptnet | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
pretty_name: ConceptNet
---
# Dataset Card for "relbert/conceptnet"
## Dataset Description
- **Repository:** [RelBERT](https://github.com/asahi417/relbert)
- **Paper:** [https://ojs.aaai.org/index.php/AAAI/article/view/11164](https://ojs.aaai.org/index.php/AAAI/article/view/11164)
- **Dataset:** ConceptNet5
### Dataset Summary
ConceptNet5, compiled to fine-tune the [RelBERT](https://github.com/asahi417/relbert) model.
## Dataset Structure
### Data Instances
An example of `train` looks as follows.
```
{
"relation_type": "AtLocation",
"positives": [["fish", "water"], ["cloud", "sky"], ["child", "school"], ... ],
"negatives": [["pen", "write"], ["sex", "fun"], ["soccer", "sport"], ["fish", "school"], ... ]
}
```
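Each record groups many word pairs under one relation type; for pairwise training it is often convenient to flatten a record into individually labelled examples. A minimal sketch (the function name is illustrative, not part of RelBERT):

```python
def flatten_record(record):
    """Turn one relation record into (head, tail, relation_type, label) tuples,
    with label 1 for positive pairs and 0 for negative pairs."""
    relation = record["relation_type"]
    return ([(head, tail, relation, 1) for head, tail in record["positives"]]
            + [(head, tail, relation, 0) for head, tail in record["negatives"]])
```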
### Data Splits
| name |train|validation|
|---------|----:|---------:|
|conceptnet| 33 | 25|
### Number of Positive/Negative Word-pairs in each Split
| relation_type | positive (train) | negative (train) | positive (validation) | negative (validation) |
|:-----------------|-------------------:|-------------------:|------------------------:|------------------------:|
| Antonym | 3175 | 206870 | 703 | 65330 |
| AtLocation | 6974 | 203071 | 727 | 65306 |
| CapableOf | 603 | 209442 | 0 | 0 |
| Causes | 906 | 209139 | 83 | 65950 |
| CausesDesire | 195 | 209850 | 30 | 66003 |
| CreatedBy | 104 | 209941 | 4 | 66029 |
| DefinedAs | 16 | 210029 | 2 | 66031 |
| Desires | 374 | 209671 | 0 | 0 |
| DistinctFrom | 1552 | 208493 | 426 | 65607 |
| Entails | 277 | 209768 | 118 | 65915 |
| HasA | 606 | 209439 | 10 | 66023 |
| HasContext | 4664 | 205381 | 1936 | 64097 |
| HasFirstSubevent | 66 | 209979 | 17 | 66016 |
| HasLastSubevent | 82 | 209963 | 14 | 66019 |
| HasPrerequisite | 586 | 209459 | 123 | 65910 |
| HasProperty | 1397 | 208648 | 0 | 0 |
| HasSubevent | 644 | 209401 | 64 | 65969 |
| InstanceOf | 1 | 210044 | 0 | 0 |
| IsA | 54028 | 156017 | 21122 | 44911 |
| LocatedNear | 21 | 210024 | 3 | 66030 |
| MadeOf | 221 | 209824 | 23 | 66010 |
| MannerOf | 8762 | 201283 | 3747 | 62286 |
| MotivatedByGoal | 282 | 209763 | 35 | 65998 |
| NotCapableOf | 17 | 210028 | 0 | 0 |
| NotDesires | 235 | 209810 | 0 | 0 |
| NotHasProperty | 74 | 209971 | 19 | 66014 |
| PartOf | 6880 | 203165 | 2629 | 63404 |
| ReceivesAction | 290 | 209755 | 0 | 0 |
| RelatedTo | 61672 | 148373 | 11356 | 54677 |
| SimilarTo | 82 | 209963 | 36 | 65997 |
| SymbolOf | 1 | 210044 | 0 | 0 |
| Synonym | 52261 | 157784 | 22391 | 43642 |
| UsedFor | 2997 | 207048 | 415 | 65618 |
| SUM | 210045 | 6.72144e+06 | 66033 | 1.58479e+06 |
### Citation Information
```
@inproceedings{speer2017conceptnet,
title={Conceptnet 5.5: An open multilingual graph of general knowledge},
author={Speer, Robyn and Chin, Joshua and Havasi, Catherine},
booktitle={Thirty-first AAAI conference on artificial intelligence},
year={2017}
}
``` |
Edoh/manim_python | ---
license: creativeml-openrail-m
---
|
frgfm/imagenette | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- apache-2.0
multilinguality: []
size_categories:
- 1K<n<10K
source_datasets:
- extended
task_categories:
- image-classification
task_ids: []
paperswithcode_id: imagenette
pretty_name: Imagenette
---
# Dataset Card for Imagenette
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/fastai/imagenette
- **Repository:** https://github.com/fastai/imagenette
- **Leaderboard:** https://paperswithcode.com/sota/image-classification-on-imagenette
### Dataset Summary
A smaller subset of 10 easily classified classes from [Imagenet](https://huggingface.co/datasets/imagenet-1k#dataset-summary), and a little more French.
This dataset was created by [Jeremy Howard](https://twitter.com/jeremyphoward); this repository only shares his work on this platform. The repository owner takes no credit of any kind for the creation, curation, or packaging of the dataset.
### Supported Tasks and Leaderboards
- `image-classification`: The dataset can be used to train a model for Image Classification.
### Languages
The class labels in the dataset are in English.
## Dataset Structure
### Data Instances
A data point comprises an image and its classification label.
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=320x320 at 0x19FA12186D8>,
'label': 'tench',
}
```
### Data Fields
- `image`: A `PIL.Image.Image` object containing the image.
- `label`: the expected class label of the image.
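For reference, the ten class names are commonly listed in the following order in the fastai repository. The snippet below is a small sketch for mapping integer labels back to names; verify the order against this repository's label feature before relying on it:

```python
# The ten Imagenette class names, in the usual order from the fastai
# repository (assumption: this repo's label feature uses the same order).
IMAGENETTE_CLASSES = [
    "tench", "English springer", "cassette player", "chain saw", "church",
    "French horn", "garbage truck", "gas pump", "golf ball", "parachute",
]

def label_name(idx: int) -> str:
    """Map an integer class label to its human-readable name."""
    return IMAGENETTE_CLASSES[idx]

print(label_name(0))
```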
### Data Splits
| |train|validation|
|----------|----:|---------:|
|imagenette| 9469| 3925|
## Dataset Creation
### Curation Rationale
cf. https://huggingface.co/datasets/imagenet-1k#curation-rationale
### Source Data
#### Initial Data Collection and Normalization
Imagenette is a subset of [ImageNet](https://huggingface.co/datasets/imagenet-1k). Information about data collection of the source data can be found [here](https://huggingface.co/datasets/imagenet-1k#initial-data-collection-and-normalization).
### Annotations
#### Annotation process
cf. https://huggingface.co/datasets/imagenet-1k#annotation-process
#### Who are the annotators?
cf. https://huggingface.co/datasets/imagenet-1k#who-are-the-annotators
### Personal and Sensitive Information
cf. https://huggingface.co/datasets/imagenet-1k#personal-and-sensitive-information
## Considerations for Using the Data
### Social Impact of Dataset
cf. https://huggingface.co/datasets/imagenet-1k#social-impact-of-dataset
### Discussion of Biases
cf. https://huggingface.co/datasets/imagenet-1k#discussion-of-biases
### Other Known Limitations
cf. https://huggingface.co/datasets/imagenet-1k#other-known-limitations
## Additional Information
### Dataset Curators
cf. https://huggingface.co/datasets/imagenet-1k#dataset-curators
and Jeremy Howard
### Licensing Information
[Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```
@software{Howard_Imagenette_2019,
title={Imagenette: A smaller subset of 10 easily classified classes from Imagenet},
author={Jeremy Howard},
year={2019},
month={March},
publisher = {GitHub},
url = {https://github.com/fastai/imagenette}
}
```
### Contributions
This dataset was created by [Jeremy Howard](https://twitter.com/jeremyphoward) and published on [Github](https://github.com/fastai/imagenette). It was then only integrated into HuggingFace Datasets by [@frgfm](https://huggingface.co/frgfm).
|
dfalbel/github-r-repos | ---
license: other
task_categories:
- text-generation
language:
- code
pretty_name: github-r-repos
size_categories:
- 100K<n<1M
---
## GitHub R repositories dataset
R source files from GitHub.
This dataset has been created using the public GitHub datasets from Google BigQuery.
This is the actual query that has been used to export the data:
```
EXPORT DATA
OPTIONS (
uri = 'gs://your-bucket/gh-r/*.parquet',
format = 'PARQUET') as
(
select
f.id, f.repo_name, f.path,
c.content, c.size
from (
SELECT distinct
id, repo_name, path
FROM `bigquery-public-data.github_repos.files`
where ends_with(path, ".R")
) as f
left join `bigquery-public-data.github_repos.contents` as c on f.id = c.id
);
EXPORT DATA
OPTIONS (
uri = 'gs://your-bucket/licenses.parquet',
format = 'PARQUET') as
(select * from `bigquery-public-data.github_repos.licenses`)
```
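The shape of the query above (filter distinct files to `.R` paths, then left-join their contents by `id`) can be sketched in plain Python over a toy in-memory table. The rows here are hypothetical examples, not actual dataset contents:

```python
# Toy sketch of the BigQuery logic: keep files whose path ends in ".R",
# then left-join file contents by id (missing ids get null content/size).
files = [
    {"id": "1", "repo_name": "user/pkg", "path": "R/utils.R"},
    {"id": "2", "repo_name": "user/pkg", "path": "README.md"},
    {"id": "3", "repo_name": "user/pkg", "path": "R/zzz.R"},
]
contents = {"1": {"content": "f <- function(x) x + 1", "size": 22}}

rows = [
    {**f, **contents.get(f["id"], {"content": None, "size": None})}
    for f in files
    if f["path"].endswith(".R")
]
print(len(rows))  # ids "1" and "3" survive the .R filter
```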
The exported files were then processed locally using the files in the root of this repository.
Datasets in this repository contain data from repositories with different licenses.
The data schema is:
```
id: string
repo_name: string
path: string
content: string
size: int32
license: string
```
Last updated: Jun 6th 2023
|
sheepy928/Purdue_reddit_posts_1500_unlabelled | ---
dataset_info:
features:
- name: title
dtype: string
- name: selftext
dtype: string
- name: created_utc
dtype: timestamp[ns]
- name: url
dtype: string
- name: author
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 504948
num_examples: 1500
download_size: 321568
dataset_size: 504948
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Purdue_reddit_posts_1500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sonup/my_first_dataset | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base | ---
pretty_name: Evaluation run of Suprit/Zhongjing-LLaMA-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Suprit/Zhongjing-LLaMA-base](https://huggingface.co/Suprit/Zhongjing-LLaMA-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T06:48:13.310278](https://huggingface.co/datasets/open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base/blob/main/results_2024-01-14T06-48-13.310278.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48560692489095947,\n\
\ \"acc_stderr\": 0.03450713063212824,\n \"acc_norm\": 0.48879741973292123,\n\
\ \"acc_norm_stderr\": 0.03524925803152966,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4888307270560722,\n\
\ \"mc2_stderr\": 0.015123753734506709\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5196245733788396,\n \"acc_stderr\": 0.0146001320759471,\n\
\ \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6026687910774746,\n\
\ \"acc_stderr\": 0.004883455188908963,\n \"acc_norm\": 0.7971519617606054,\n\
\ \"acc_norm_stderr\": 0.004012984497778308\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.040403110624904356,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.040403110624904356\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970275,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970275\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101737,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5451612903225806,\n\
\ \"acc_stderr\": 0.028327743091561074,\n \"acc_norm\": 0.5451612903225806,\n\
\ \"acc_norm_stderr\": 0.028327743091561074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016339,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016339\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.033088185944157494,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.033088185944157494\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.025254485424799595,\n\
\ \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799595\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.032468167657521745,\n\
\ \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.032468167657521745\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6403669724770642,\n \"acc_stderr\": 0.020575234660123776,\n \"\
acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.020575234660123776\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536027,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536027\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n \"\
acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.03038193194999041,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.03038193194999041\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138938,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138938\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n\
\ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.7478632478632479,\n\
\ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\
\ \"acc_stderr\": 0.016617501738763397,\n \"acc_norm\": 0.6845466155810983,\n\
\ \"acc_norm_stderr\": 0.016617501738763397\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260664,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.028274359854894248,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.028274359854894248\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.02748747298087159,\n\
\ \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.02748747298087159\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878645,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878645\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34615384615384615,\n\
\ \"acc_stderr\": 0.012150699768228553,\n \"acc_norm\": 0.34615384615384615,\n\
\ \"acc_norm_stderr\": 0.012150699768228553\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47549019607843135,\n \"acc_stderr\": 0.020203517280261443,\n \
\ \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.020203517280261443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.031987615467631264,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.031987615467631264\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4888307270560722,\n\
\ \"mc2_stderr\": 0.015123753734506709\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259785\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2608036391205459,\n \
\ \"acc_stderr\": 0.012094252417332734\n }\n}\n```"
repo_url: https://huggingface.co/Suprit/Zhongjing-LLaMA-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|arc:challenge|25_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|gsm8k|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hellaswag|10_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T06-48-13.310278.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- '**/details_harness|winogrande|5_2024-01-14T06-48-13.310278.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T06-48-13.310278.parquet'
- config_name: results
data_files:
- split: 2024_01_14T06_48_13.310278
path:
- results_2024-01-14T06-48-13.310278.parquet
- split: latest
path:
- results_2024-01-14T06-48-13.310278.parquet
---
# Dataset Card for Evaluation run of Suprit/Zhongjing-LLaMA-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Suprit/Zhongjing-LLaMA-base](https://huggingface.co/Suprit/Zhongjing-LLaMA-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-14T06:48:13.310278](https://huggingface.co/datasets/open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base/blob/main/results_2024-01-14T06-48-13.310278.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48560692489095947,
"acc_stderr": 0.03450713063212824,
"acc_norm": 0.48879741973292123,
"acc_norm_stderr": 0.03524925803152966,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4888307270560722,
"mc2_stderr": 0.015123753734506709
},
"harness|arc:challenge|25": {
"acc": 0.5196245733788396,
"acc_stderr": 0.0146001320759471,
"acc_norm": 0.5511945392491467,
"acc_norm_stderr": 0.014534599585097664
},
"harness|hellaswag|10": {
"acc": 0.6026687910774746,
"acc_stderr": 0.004883455188908963,
"acc_norm": 0.7971519617606054,
"acc_norm_stderr": 0.004012984497778308
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970275,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970275
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.028327743091561074,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.028327743091561074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016339,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016339
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.033088185944157494,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.033088185944157494
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.025254485424799595,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.025254485424799595
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.020575234660123776,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.020575234660123776
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536027,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536027
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.03038193194999041,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.03038193194999041
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138938,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138938
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7478632478632479,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.7478632478632479,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.016617501738763397,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.016617501738763397
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260664,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260664
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.028274359854894248,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.028274359854894248
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.02748747298087159,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.02748747298087159
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.028538650028878645,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.028538650028878645
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.012150699768228553,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.012150699768228553
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.020203517280261443,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.020203517280261443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4888307270560722,
"mc2_stderr": 0.015123753734506709
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259785
},
"harness|gsm8k|5": {
"acc": 0.2608036391205459,
"acc_stderr": 0.012094252417332734
}
}
```
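The per-task entries above can be aggregated directly once the results JSON is parsed. As a minimal sketch (using a hypothetical excerpt of the dictionary shown above rather than the full file), the average MMLU accuracy is simply the mean of the `acc` values over the `hendrycksTest` subtasks:

```python
# Hypothetical excerpt of the results JSON above; in practice you would parse
# the full results_*.json file from this repository.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4888888888888889},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.4407894736842105},
}

# Keep only the MMLU (hendrycksTest) subtasks, then average their accuracies.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU average over {len(mmlu_tasks)} subtasks: {mmlu_acc:.4f}")
```

The same pattern works for any metric key (`acc_norm`, `mc2`, ...) present in the per-task dictionaries.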
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3 | ---
pretty_name: Evaluation run of Juniplayground/Mist_LLaMA-2-7B-1024_V3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Juniplayground/Mist_LLaMA-2-7B-1024_V3](https://huggingface.co/Juniplayground/Mist_LLaMA-2-7B-1024_V3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T16:28:04.317778](https://huggingface.co/datasets/open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3/blob/main/results_2023-10-28T16-28-04.317778.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788268446,\n \"f1\": 0.05569840604026855,\n\
\ \"f1_stderr\": 0.0013255684995797806,\n \"acc\": 0.39087485257361143,\n\
\ \"acc_stderr\": 0.009174257360532706\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268446,\n\
\ \"f1\": 0.05569840604026855,\n \"f1_stderr\": 0.0013255684995797806\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04852160727824109,\n \
\ \"acc_stderr\": 0.00591846861892108\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.012430046102144333\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Juniplayground/Mist_LLaMA-2-7B-1024_V3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T16_28_04.317778
path:
- '**/details_harness|drop|3_2023-10-28T16-28-04.317778.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T16-28-04.317778.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T16_28_04.317778
path:
- '**/details_harness|gsm8k|5_2023-10-28T16-28-04.317778.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T16-28-04.317778.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-13-03.510768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-13-03.510768.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-13-03.510768.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T16_28_04.317778
path:
- '**/details_harness|winogrande|5_2023-10-28T16-28-04.317778.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T16-28-04.317778.parquet'
- config_name: results
data_files:
- split: 2023_10_04T07_13_03.510768
path:
- results_2023-10-04T07-13-03.510768.parquet
- split: 2023_10_28T16_28_04.317778
path:
- results_2023-10-28T16-28-04.317778.parquet
- split: latest
path:
- results_2023-10-28T16-28-04.317778.parquet
---
# Dataset Card for Evaluation run of Juniplayground/Mist_LLaMA-2-7B-1024_V3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Juniplayground/Mist_LLaMA-2-7B-1024_V3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Juniplayground/Mist_LLaMA-2-7B-1024_V3](https://huggingface.co/Juniplayground/Mist_LLaMA-2-7B-1024_V3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3",
"harness_winogrande_5",
split="train")
```
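Because each non-`latest` split name encodes its run timestamp, the most recent run can also be identified programmatically without relying on the `latest` alias (a minimal sketch; `most_recent_split` is a hypothetical helper, not part of the `datasets` API):

```python
from datetime import datetime

# Split names encode the run timestamp, e.g. "2023_10_04T07_13_03.510768".
# Assumption: every split except "latest" follows this exact format.
def most_recent_split(split_names):
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

runs = ["2023_10_04T07_13_03.510768", "2023_10_28T16_28_04.317778", "latest"]
print(most_recent_split(runs))  # → 2023_10_28T16_28_04.317778
```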
## Latest results
These are the [latest results from run 2023-10-28T16:28:04.317778](https://huggingface.co/datasets/open-llm-leaderboard/details_Juniplayground__Mist_LLaMA-2-7B-1024_V3/blob/main/results_2023-10-28T16-28-04.317778.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268446,
"f1": 0.05569840604026855,
"f1_stderr": 0.0013255684995797806,
"acc": 0.39087485257361143,
"acc_stderr": 0.009174257360532706
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268446,
"f1": 0.05569840604026855,
"f1_stderr": 0.0013255684995797806
},
"harness|gsm8k|5": {
"acc": 0.04852160727824109,
"acc_stderr": 0.00591846861892108
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.012430046102144333
}
}
```
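Since the payload above is a plain nested dict, per-task metrics can be pulled out with ordinary dict comprehensions (a small sketch using the numbers reported for this run):

```python
# Aggregated results keyed by "harness|<task>|<n_shots>", as in the JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.04852160727824109, "acc_stderr": 0.00591846861892108},
    "harness|winogrande|5": {"acc": 0.7332280978689818, "acc_stderr": 0.012430046102144333},
}

# Keep only entries that report an accuracy metric.
accs = {task: metrics["acc"] for task, metrics in results.items() if "acc" in metrics}
print(accs)
```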
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16 | ---
pretty_name: Evaluation run of TheBloke/CodeLlama-13B-Python-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/CodeLlama-13B-Python-fp16](https://huggingface.co/TheBloke/CodeLlama-13B-Python-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T10:58:59.562452](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16/blob/main/results_2023-10-22T10-58-59.562452.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857104,\n \"f1\": 0.04942743288590626,\n\
\ \"f1_stderr\": 0.001208970062104149,\n \"acc\": 0.3874335571481828,\n\
\ \"acc_stderr\": 0.01073390691452439\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857104,\n\
\ \"f1\": 0.04942743288590626,\n \"f1_stderr\": 0.001208970062104149\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10083396512509477,\n \
\ \"acc_stderr\": 0.008294031192126591\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6740331491712708,\n \"acc_stderr\": 0.013173782636922189\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/CodeLlama-13B-Python-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|arc:challenge|25_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T10_58_59.562452
path:
- '**/details_harness|drop|3_2023-10-22T10-58-59.562452.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T10-58-59.562452.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T10_58_59.562452
path:
- '**/details_harness|gsm8k|5_2023-10-22T10-58-59.562452.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T10-58-59.562452.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hellaswag|10_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T10_58_59.562452
path:
- '**/details_harness|winogrande|5_2023-10-22T10-58-59.562452.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T10-58-59.562452.parquet'
- config_name: results
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- results_2023-08-25T19:26:38.056569.parquet
- split: 2023_10_22T10_58_59.562452
path:
- results_2023-10-22T10-58-59.562452.parquet
- split: latest
path:
- results_2023-10-22T10-58-59.562452.parquet
---
# Dataset Card for Evaluation run of TheBloke/CodeLlama-13B-Python-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/CodeLlama-13B-Python-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/CodeLlama-13B-Python-fp16](https://huggingface.co/TheBloke/CodeLlama-13B-Python-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T10:58:59.562452](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16/blob/main/results_2023-10-22T10-58-59.562452.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857104,
"f1": 0.04942743288590626,
"f1_stderr": 0.001208970062104149,
"acc": 0.3874335571481828,
"acc_stderr": 0.01073390691452439
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857104,
"f1": 0.04942743288590626,
"f1_stderr": 0.001208970062104149
},
"harness|gsm8k|5": {
"acc": 0.10083396512509477,
"acc_stderr": 0.008294031192126591
},
"harness|winogrande|5": {
"acc": 0.6740331491712708,
"acc_stderr": 0.013173782636922189
}
}
```
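As a quick sanity check, the top-level `"all"` accuracy appears to be the unweighted mean of the per-task accuracies; a minimal, self-contained sketch using values copied from the JSON above:

```python
# Per-task accuracies copied from the latest-results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.10083396512509477,
    "harness|winogrande|5": 0.6740331491712708,
}

# The aggregated "all" accuracy is the unweighted mean over tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
assert abs(mean_acc - 0.3874335571481828) < 1e-9
```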
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_233 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1120382576
num_examples: 220028
download_size: 1144573483
dataset_size: 1120382576
---
# Dataset Card for "chunk_233"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UdhayBrahmi/Demo_70K | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 317060507
num_examples: 70000
download_size: 155575101
dataset_size: 317060507
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/lynette_lapisrelights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Lynette (Lapis Re:LiGHTs)
This is the dataset of Lynette (Lapis Re:LiGHTs), containing 216 images and their tags.
The core tags of this character are `green_hair, short_hair, hairband, purple_eyes, bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 216 | 119.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lynette_lapisrelights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 216 | 104.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lynette_lapisrelights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 409 | 184.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lynette_lapisrelights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 216 | 119.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lynette_lapisrelights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 409 | 205.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lynette_lapisrelights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lynette_lapisrelights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, closed_mouth, upper_body, portrait, school_uniform, smile, collarbone, parody |
| 1 | 5 |  |  |  |  |  | 1girl, blush, solo, holding_book, school_uniform, upper_body |
| 2 | 8 |  |  |  |  |  | 1girl, earrings, solo, blush, shirt, upper_body, headband, looking_at_viewer |
| 3 | 7 |  |  |  |  |  | 1girl, dress, earrings, solo, looking_at_viewer |
| 4 | 8 |  |  |  |  |  | 2girls, smile, solo_focus, upper_body, ascot, closed_mouth, frills, puffy_short_sleeves, sailor_collar, school_uniform, looking_at_viewer, blush, collarbone, red_hair, white_shirt |
| 5 | 5 |  |  |  |  |  | 2girls, blue_hair, earrings, long_hair, looking_at_viewer, blush, headband, white_shirt |
| 6 | 9 |  |  |  |  |  | 1girl, fingerless_gloves, solo, bike_shorts, black_gloves, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | closed_mouth | upper_body | portrait | school_uniform | smile | collarbone | parody | holding_book | earrings | shirt | headband | dress | 2girls | solo_focus | ascot | frills | puffy_short_sleeves | sailor_collar | red_hair | white_shirt | blue_hair | long_hair | fingerless_gloves | bike_shorts | black_gloves | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:---------------|:-------------|:-----------|:-----------------|:--------|:-------------|:---------|:---------------|:-----------|:--------|:-----------|:--------|:---------|:-------------|:--------|:---------|:----------------------|:----------------|:-----------|:--------------|:------------|:------------|:--------------------|:--------------|:---------------|:-------------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | | X | | | | | | | X | X | X | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | | | | | | | | X | | | X | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | | | X | X | X | X | | X | X | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | |
| 5 | 5 |  |  |  |  |  | | | X | X | | | | | | | | | X | | X | | X | | | | | | | X | X | X | | | | |
| 6 | 9 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X |
|
Jhonjhorg/bangladesh_udacity | ---
configs:
- config_name: bangladesh_udacity
data_files:
- split: train
path: "image_mixed.npy"
- split: label
path: "label_mixed.npy"
---
```
bangladesh_udacity/
├── README.md
├── image_mixed.npy
└── label_mixed.npy
```
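The card only lists the two `.npy` archives; a hedged sketch of loading them with NumPy (file names come from the card, while the demo arrays, shapes, and dtypes below are stand-ins, not the real data):

```python
import numpy as np

def load_split(image_path: str, label_path: str):
    """Load an image/label pair of .npy archives and sanity-check alignment."""
    images = np.load(image_path)
    labels = np.load(label_path)
    assert len(images) == len(labels), "image/label count mismatch"
    return images, labels

# Self-contained demo with stand-in arrays (the real shapes/dtypes may differ):
np.save("image_mixed.npy", np.zeros((4, 32, 32, 3), dtype=np.uint8))
np.save("label_mixed.npy", np.arange(4))
images, labels = load_split("image_mixed.npy", "label_mixed.npy")
```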
|
ChaoticNeutrals/Synthetic-Dark-RP | ---
license: agpl-3.0
---
Mature RP dataset created by Claude 3 Opus. After some back-and-forth discussion, Claude and I came to an accord on the creation of this adult-themed dataset. The data was created without supervision after feeding Opus LimaRP as an example dataset; it was also created fresh off the heels of the original Synthetic-RP, which may have contributed to Claude's willingness to output more mature content.
# WARNING
Does contain "GPT-isms" (e.g. "Shivers Down Spine"); these were left in to maintain the original data. A slopless version may be created depending on popularity. |
open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16 | ---
pretty_name: Evaluation run of uukuguy/Orca-2-13b-f16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/Orca-2-13b-f16](https://huggingface.co/uukuguy/Orca-2-13b-f16) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T16:43:12.398370](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16/blob/main/results_2023-12-04T16-43-12.398370.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6024963144778468,\n\
\ \"acc_stderr\": 0.03292700891541927,\n \"acc_norm\": 0.6070525664063983,\n\
\ \"acc_norm_stderr\": 0.03359636787928049,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5641081747684346,\n\
\ \"mc2_stderr\": 0.015927666604862285\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693024\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6115315674168492,\n\
\ \"acc_stderr\": 0.004864058877626273,\n \"acc_norm\": 0.7981477793268273,\n\
\ \"acc_norm_stderr\": 0.004005621755121483\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.03554180368025689,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.03554180368025689\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.032436186361081004,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.032436186361081004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.02499305339776481,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.02499305339776481\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767762,\n\
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767762\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n\
\ \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n\
\ \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n\
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.015382845587584518,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.015382845587584518\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900933,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900933\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291484,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291484\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.012665568135455335,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.012665568135455335\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777508,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777508\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5641081747684346,\n\
\ \"mc2_stderr\": 0.015927666604862285\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38968915845337376,\n \
\ \"acc_stderr\": 0.013433123236110692\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/Orca-2-13b-f16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-43-12.398370.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-43-12.398370.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- '**/details_harness|winogrande|5_2023-12-04T16-43-12.398370.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T16-43-12.398370.parquet'
- config_name: results
data_files:
- split: 2023_12_04T16_43_12.398370
path:
- results_2023-12-04T16-43-12.398370.parquet
- split: latest
path:
- results_2023-12-04T16-43-12.398370.parquet
---
# Dataset Card for Evaluation run of uukuguy/Orca-2-13b-f16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/Orca-2-13b-f16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/Orca-2-13b-f16](https://huggingface.co/uukuguy/Orca-2-13b-f16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16",
"harness_winogrande_5",
split="train")
```
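The timestamped split names listed in the configurations above follow a simple convention: dashes and colons in the run timestamp are replaced with underscores. A minimal sketch of that mapping, inferred from the split names in this card rather than any documented leaderboard API:

```python
# Derive a configuration's timestamped split name from the run timestamp.
# This mirrors the naming convention visible in the config listing above
# (an inference from this card, not an official API).
run_timestamp = "2023-12-04T16:43:12.398370"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # → 2023_12_04T16_43_12.398370
```

Passing either this timestamped name or `"latest"` as the `split=` argument to `load_dataset` should select the same run whenever only one run exists.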
## Latest results
These are the [latest results from run 2023-12-04T16:43:12.398370](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16/blob/main/results_2023-12-04T16-43-12.398370.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6024963144778468,
"acc_stderr": 0.03292700891541927,
"acc_norm": 0.6070525664063983,
"acc_norm_stderr": 0.03359636787928049,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5641081747684346,
"mc2_stderr": 0.015927666604862285
},
"harness|arc:challenge|25": {
"acc": 0.5733788395904437,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693024
},
"harness|hellaswag|10": {
"acc": 0.6115315674168492,
"acc_stderr": 0.004864058877626273,
"acc_norm": 0.7981477793268273,
"acc_norm_stderr": 0.004005621755121483
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.03554180368025689,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.03554180368025689
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283648,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283648
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.02499305339776481,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.02499305339776481
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.024864995159767762,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.024864995159767762
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200144,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647897,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647897
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.015382845587584518,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.015382845587584518
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900933,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900933
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291484,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291484
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455335,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455335
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777508,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5641081747684346,
"mc2_stderr": 0.015927666604862285
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.38968915845337376,
"acc_stderr": 0.013433123236110692
}
}
```
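The per-task keys in the JSON above share a `harness|<task>|<n_shots>` pattern, so aggregate metrics such as the MMLU average can be recomputed by filtering on the `hendrycksTest-` prefix. A minimal sketch, using a small illustrative subset of the results rather than the full dict:

```python
# Recompute a mean accuracy over the MMLU (hendrycksTest) subtasks from a
# results dict shaped like the JSON above. The values below are a small
# illustrative subset copied from this card, not the complete results.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|arc:challenge|25": {"acc": 0.5733788395904437},  # excluded below
}

mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # → 0.4737
```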
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Svngoku/kikongo-french-translation | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- kg
- fr
splits:
- name: train
num_bytes: 21895.010869565216
num_examples: 588
- name: test
num_bytes: 5510.989130434783
num_examples: 148
download_size: 25010
dataset_size: 27406
language:
- kg
- fr
---
# Dataset Card for "kikongo-french-translation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
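Each row of this dataset exposes a `translation` dict keyed by the language codes declared in the metadata above (`kg`, `fr`). A minimal sketch of how such a row is shaped and accessed — the sentence pair below is a hypothetical placeholder, not taken from the data:

```python
# A translation-feature row as produced by `datasets` for this schema:
# {"id": ..., "translation": {"kg": ..., "fr": ...}}. The strings below
# are invented placeholders, not real rows from the dataset.
example = {
    "id": "0",
    "translation": {"kg": "mbote", "fr": "bonjour"},
}
src = example["translation"]["kg"]
tgt = example["translation"]["fr"]
print(f"{src} -> {tgt}")  # → mbote -> bonjour
```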
Rootreck/so-vits-svc-4.0-ru-The_Witcher_3_Wild_Hunt | ---
language:
- ru
---
This is the training data for the character voice models from "The Witcher 3: Wild Hunt" for so-vits-svc-4.1.1
|
0x7o/value_determinant | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
pretty_name: Value Determinant
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_namanpundir__theus_concepttagger | ---
pretty_name: Evaluation run of namanpundir/theus_concepttagger
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [namanpundir/theus_concepttagger](https://huggingface.co/namanpundir/theus_concepttagger)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_namanpundir__theus_concepttagger\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T09:19:07.563026](https://huggingface.co/datasets/open-llm-leaderboard/details_namanpundir__theus_concepttagger/blob/main/results_2024-01-19T09-19-07.563026.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23155074551271232,\n\
\ \"acc_stderr\": 0.029928741303991906,\n \"acc_norm\": 0.2318198554858782,\n\
\ \"acc_norm_stderr\": 0.030717432810767893,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.4824892409622544,\n\
\ \"mc2_stderr\": 0.01656017052953912\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20733788395904437,\n \"acc_stderr\": 0.011846905782971356,\n\
\ \"acc_norm\": 0.24573378839590443,\n \"acc_norm_stderr\": 0.012581033453730111\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25761800438159727,\n\
\ \"acc_stderr\": 0.004364287353415452,\n \"acc_norm\": 0.2550288787094204,\n\
\ \"acc_norm_stderr\": 0.004349866376068979\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.01502635482491078,\n\
\ \"mc2\": 0.4824892409622544,\n \"mc2_stderr\": 0.01656017052953912\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.48303078137332284,\n\
\ \"acc_stderr\": 0.014044390401612978\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/namanpundir/theus_concepttagger
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|arc:challenge|25_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|gsm8k|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hellaswag|10_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T09-19-07.563026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T09-19-07.563026.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- '**/details_harness|winogrande|5_2024-01-19T09-19-07.563026.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T09-19-07.563026.parquet'
- config_name: results
data_files:
- split: 2024_01_19T09_19_07.563026
path:
- results_2024-01-19T09-19-07.563026.parquet
- split: latest
path:
- results_2024-01-19T09-19-07.563026.parquet
---
# Dataset Card for Evaluation run of namanpundir/theus_concepttagger
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [namanpundir/theus_concepttagger](https://huggingface.co/namanpundir/theus_concepttagger) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_namanpundir__theus_concepttagger",
"harness_winogrande_5",
split="train")
```
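As noted above, each non-"latest" split name encodes the timestamp of the evaluation run (e.g. `2024_01_19T09_19_07.563026`). A small helper (hypothetical, not part of the leaderboard tooling) can convert such a split name back into a `datetime` object, which is handy for sorting or comparing successive runs:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names use "_" instead of "-"/":" so they are valid identifiers,
    # e.g. "2024_01_19T09_19_07.563026" -> 2024-01-19 09:19:07.563026
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

ts = parse_split_timestamp("2024_01_19T09_19_07.563026")
print(ts.isoformat())  # 2024-01-19T09:19:07.563026
```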
## Latest results
These are the [latest results from run 2024-01-19T09:19:07.563026](https://huggingface.co/datasets/open-llm-leaderboard/details_namanpundir__theus_concepttagger/blob/main/results_2024-01-19T09-19-07.563026.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.23155074551271232,
"acc_stderr": 0.029928741303991906,
"acc_norm": 0.2318198554858782,
"acc_norm_stderr": 0.030717432810767893,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.4824892409622544,
"mc2_stderr": 0.01656017052953912
},
"harness|arc:challenge|25": {
"acc": 0.20733788395904437,
"acc_stderr": 0.011846905782971356,
"acc_norm": 0.24573378839590443,
"acc_norm_stderr": 0.012581033453730111
},
"harness|hellaswag|10": {
"acc": 0.25761800438159727,
"acc_stderr": 0.004364287353415452,
"acc_norm": 0.2550288787094204,
"acc_norm_stderr": 0.004349866376068979
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.4824892409622544,
"mc2_stderr": 0.01656017052953912
},
"harness|winogrande|5": {
"acc": 0.48303078137332284,
"acc_stderr": 0.014044390401612978
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
napatswift/bkk-budget-ner-page | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
0: O
1: B-ENTRY
2: I-ENTRY
splits:
- name: train
num_bytes: 2455950.107936508
num_examples: 472
- name: test
num_bytes: 822118.8920634921
num_examples: 158
download_size: 377734
dataset_size: 3278069.0
---
# Dataset Card for "bkk-budget-ner-page"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-project-jnlpba-3af3e90f-1276248800 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jnlpba
eval_info:
task: entity_extraction
model: siddharthtumre/biobert-finetuned-jnlpba
metrics: []
dataset_name: jnlpba
dataset_config: jnlpba
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: siddharthtumre/biobert-finetuned-jnlpba
* Dataset: jnlpba
* Config: jnlpba
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@siddharthtumre](https://huggingface.co/siddharthtumre) for evaluating this model. |
Yiran0924/TryFelm | ---
language:
- en
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
wetdog/parlament_parla_resnet_emb | ---
dataset_info:
features:
- name: path
dtype: string
- name: speaker_id
dtype: int64
- name: sentence
dtype: string
- name: gender
dtype:
class_label:
names:
'0': F
'1': M
- name: duration
dtype: float64
- name: embeddings
sequence: float64
splits:
- name: train
num_bytes: 180990368
num_examples: 78976
- name: validation
num_bytes: 4903267
num_examples: 2150
- name: test
num_bytes: 4878519
num_examples: 2138
download_size: 166083881
dataset_size: 190772154
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
pythainlp/thaigov-corpus | ---
language:
- th
license: cc0-1.0
size_categories:
- 10K<n<100K
task_categories:
- text-generation
dataset_info:
features:
- name: title
dtype: string
- name: context
dtype: string
- name: raw
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 517445225
num_examples: 28555
download_size: 173349955
dataset_size: 517445225
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# ThaiGov corpus
GitHub: [https://github.com/PyThaiNLP/thaigov-corpus](https://github.com/PyThaiNLP/thaigov-corpus)
## English
- Data from the Thai government website: https://www.thaigov.go.th
- This is part of the PyThaiNLP Project.
- Compiled by Mr. Wannaphong Phatthiyaphaibun
- License: the dataset is in the public domain.
## Data format
- 1 file = 1 news item, extracted from 1 URL.
```
topic
(Blank line)
content
content
content
content
content
(Blank line)
ที่มา (URL source) : http://www.thaigov.go.th/news/contents/details/NNN
```
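A minimal sketch of splitting one such file into its parts (hypothetical helper, assuming exactly the layout above):

```python
def parse_thaigov_file(text):
    # Hypothetical helper: split one news file into (title, body, url),
    # following the "topic / blank / content / blank / source" layout.
    lines = text.strip().split("\n")
    title = lines[0].strip()
    source_line = lines[-1]
    # The source line looks like "ที่มา : http://..."; the first colon
    # separates the marker from the URL.
    url = source_line.split(":", 1)[1].strip() if ":" in source_line else ""
    body = "\n".join(lines[1:-1]).strip()
    return title, body, url

sample = "หัวเรื่อง\n\nเนื้อความ\nเนื้อความ\n\nที่มา : http://www.thaigov.go.th/news/contents/details/123"
title, body, url = parse_thaigov_file(sample)
```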
## Thai
- Data collected from the Thai government website https://www.thaigov.go.th
- This project is part of the [PyThaiNLP](https://github.com/PyThaiNLP/) development plan.
- Compiled by Mr. Wannaphong Phatthiyaphaibun
- The data collected in this corpus is in the public domain under Section 7 of the Thai Copyright Act B.E. 2537 (1994) (the following are not considered copyrighted works under this Act: (1) daily news and facts that are merely news reports, not works of literature, science, or art [...] (3) regulations, rules, announcements, orders, clarifications, and official correspondence of ministries, bureaus, departments, or other state or local agencies [...])
**The edit history of this corpus can be tracked via Git**
### Number of news items
- Project start date: 14 February 2018
- Last collected: 01:50 on 18 March 2020
### Data format
- 1 file = 1 news item, extracted from 1 URL
```
title
(blank line)
content
content
content
content
content
(blank line)
ที่มา : http://www.thaigov.go.th/news/contents/details/NNN
```
### File naming
- category-name_news-number.txt
- Folders are numbered 1 - 24 (there is no folder 13)
### Scripts
- run.py collects data from web pages, fetching each page from ```http://www.thaigov.go.th/news/contents/details/NNN```, where NNN is an integer
  - Change the variable i in the file to the number you want to start collecting from
- clean.py performs basic cleaning: it strips leading and trailing whitespace on each line and removes blank lines
  - ```clean.py filename```
  - ```clean.py filename1 filename2```
  - ```clean.py *.txt```
We build Thai NLP.
PyThaiNLP |
autoevaluate/autoeval-staging-eval-project-xsum-8dc1621c-12925735 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: google/pegasus-xsum
metrics: ['bleu']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-xsum
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@xarymast](https://huggingface.co/xarymast) for evaluating this model. |
japanese-asr/whisper_transcriptions.reazonspeech.all_44 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30529710099.0
num_examples: 267968
download_size: 30292906500
dataset_size: 30529710099.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
PaulineSanchez/Translation_words_and_sentences_english_french | ---
task_categories:
- translation
language:
- en
- fr
tags:
- words
- sentences
- everyday life
- casual
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
This dataset is a clean version (all NaN rows removed) of this dataset: https://www.kaggle.com/datasets/devicharith/language-translation-englishfrench. I am not the person who first posted it on Kaggle.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mPLUG/DocLocal4K | ---
license: apache-2.0
---
|
flozi00/dibt_de | ---
dataset_info:
features:
- name: prompt
dtype: string
id: field
- name: quality
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: string
id: suggestion
- name: status
dtype: string
id: question
- name: metadata
dtype: string
id: metadata
- name: avg_rating
dtype: float64
- name: num_responses
dtype: int64
- name: agreement_ratio
dtype: float64
- name: raw_responses
sequence: int64
- name: kind
dtype: string
- name: prompt_de
dtype: string
splits:
- name: train
num_bytes: 2830060
num_examples: 3677
download_size: 1323439
dataset_size: 2830060
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-high_school_macroeconomics-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 2508
num_examples: 5
download_size: 0
dataset_size: 2508
---
# Dataset Card for "mmlu-high_school_macroeconomics-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ykumards/open-i | ---
dataset_info:
features:
- name: uid
dtype: int64
- name: MeSH
dtype: string
- name: Problems
dtype: string
- name: image
dtype: string
- name: indication
dtype: string
- name: comparison
dtype: string
- name: findings
dtype: string
- name: impression
dtype: string
- name: img_frontal
dtype: binary
- name: img_lateral
dtype: binary
splits:
- name: train
num_bytes: 2104109741
num_examples: 3851
download_size: 2095869611
dataset_size: 2104109741
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-nd-4.0
language:
- en
pretty_name: Chest X-rays (Indiana University)
size_categories:
- 1K<n<10K
---
# Chest X-rays (Indiana University)
Copy of the Kaggle dataset https://www.kaggle.com/datasets/raddar/chest-xrays-indiana-university created by [raddar](https://www.kaggle.com/raddar)
---
Open access chest X-ray collection from Indiana University
Original source: https://openi.nlm.nih.gov/
Original images were downloaded in the raw DICOM standard. Each image was converted to PNG using the following post-processing:
- top/bottom 0.5% of DICOM pixel values were clipped (to eliminate very dark or very bright pixel outliers)
- DICOM pixel values were scaled linearly to fit into the 0-255 range
- images were resized to 2048 px on the shorter side (to fit in Kaggle dataset limits)
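The clipping and scaling steps can be sketched in pure Python (illustrative only; the actual conversion presumably used DICOM tooling, and the exact percentile handling is an assumption):

```python
def clip_and_scale(values, frac=0.005):
    # Illustrative sketch: clip the top/bottom 0.5% of pixel values,
    # then scale linearly into the 0-255 range.
    s = sorted(values)
    n = len(s)
    lo = s[int(frac * (n - 1))]
    hi = s[int((1 - frac) * (n - 1))]
    span = (hi - lo) or 1  # avoid division by zero on flat images
    return [(min(max(v, lo), hi) - lo) * 255 // span for v in values]

pixels = clip_and_scale(list(range(1000)))
```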
Metadata downloaded using available API (https://openi.nlm.nih.gov/services#searchAPIUsingGET)
Each image was classified manually into frontal and lateral chest X-ray categories.
License: [Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) ](https://creativecommons.org/licenses/by-nc-nd/4.0/)
---
### Usage
The lateral and frontal images of each uid are grouped together in an example. The images are stored as bytes and can be loaded into a PIL Image using the following method:
```
import io
from PIL import Image

def load_image_from_byte_array(byte_array):
    return Image.open(io.BytesIO(byte_array))
```
### Cite
Please cite the original [source of the dataset](https://openi.nlm.nih.gov/).
```
@article{demner2016preparing,
title={Preparing a collection of radiology examinations for distribution and retrieval},
author={Demner-Fushman, Dina and Kohli, Marc D and Rosenman, Marc B and Shooshan, Sonya E and Rodriguez, Laritza and Antani, Sameer and Thoma, George R and McDonald, Clement J},
journal={Journal of the American Medical Informatics Association},
volume={23},
number={2},
pages={304--310},
year={2016},
publisher={Oxford University Press}
}
```
|
taesiri/GTA_V_CLIP_Embeddings | ---
license: openrail++
---
|
pengGG/kanqilaibucuo | ---
license: openrail
---
|
deven367/babylm-10M-simple_wikipedia | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9270331
num_examples: 56617
- name: valid
num_bytes: 9591764
num_examples: 60977
- name: test
num_bytes: 11102812
num_examples: 66392
download_size: 18016430
dataset_size: 29964907
---
# Dataset Card for "babylm-10M-simple_wikipedia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Organika/wizard_of_wikipedia | ---
dataset_info:
features:
- name: persona
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 20482563.837279905
num_examples: 9223
- name: test
num_bytes: 5121196.162720097
num_examples: 2306
download_size: 15397630
dataset_size: 25603760.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Anonymous2023/KGSum | ---
license: openrail
---
|
distilled-from-one-sec-cv12/chunk_159 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1243432280
num_examples: 242290
download_size: 1268760744
dataset_size: 1243432280
---
# Dataset Card for "chunk_159"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/conj_neg_facts_azaria_mitchell | ---
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 59584.798573975044
num_examples: 448
- name: test
num_bytes: 15029.201426024956
num_examples: 113
download_size: 32145
dataset_size: 74614.0
---
# Dataset Card for "conj_neg_facts_azaria_mitchell"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yezhengli9/wmt20-en-ps | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 1580755
num_examples: 2719
download_size: 636184
dataset_size: 1580755
---
# Dataset Card for "wmt20-en-ps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CATIE-AQ/french_book_reviews_fr_prompt_sentiment_analysis | ---
language:
- fr
license:
- cc
size_categories:
- 100K<n<1M
task_categories:
- text-classification
tags:
- binary-sentiment-analysis
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- french_book_reviews
---
# french_book_reviews_fr_prompt_sentiment_analysis
## Summary
**french_book_reviews_fr_prompt_sentiment_analysis** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **270,424** rows that can be used for a binary sentiment analysis task.
The original data (without prompts) comes from the dataset [french_book_reviews](https://huggingface.co/datasets/Abirate/french_book_reviews) by Eltaief.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
28 prompts were created for this dataset. The logic applied consists of proposing prompts in the indicative mood, in both the informal (tutoiement) and formal (vouvoiement) forms of address.
```
'Commentaire : "'+review+'" Le commentaire est-il positif ou négatif ?',
"""Avis : " """+review+""" " L'avis est-il positif ou négatif ?""",
'Critique : "'+review+'" La critique est-elle positive ou négative ?',
"""Evaluation : " """+review+""" " L'évaluation est-elle positive ou négative ?""",
'Ce commentaire sur le produit est-il positif ou négatif ? \nCommentaire : "'+review+'"\nRéponse :',
'Cet avis sur le produit est-il positif ou négatif ? \nAvis : "'+review+'"\nRéponse :',
'Cette critique sur le produit est-elle positive ou négative ? \nCritique : "'+review+'"\nRéponse :',
'Cette évaluation sur le produit est-elle positive ou négative ? \nEvaluation : "'+review+'"\nRéponse :',
'Commentaire : "'+review+'"\n Ce commentaire sur le produit exprime-t-il un sentiment négatif ou positif ?',
'Avis : "'+review+'"\n Cet avis sur le produit exprime-t-il un sentiment négatif ou positif ?',
'Critique : "'+review+'"\n Cette critique sur le produit exprime-t-il un sentiment négatif ou positif ?',
'Evaluation : "'+review+'"\n Cette évaluation sur le produit exprime-t-il un sentiment négatif ou positif ?',
'Ce commentaire sur le produit a-t-il un ton négatif ou positif ? \n Commentaire : "'+review+'"\n Réponse :',
'Cet avis sur le produit a-t-il un ton négatif ou positif ? \n Avis : "'+review+'"\n Réponse :',
'Cette critique sur le produit a-t-il un ton négatif ou positif ? \n Evaluation : "'+review+'"\n Réponse :',
'Cette évaluation sur le produit a-t-il un ton négatif ou positif ? \n Avis : "'+review+'"\n Réponse :',
"""Voici un commentaire laissé par un client sur un produit. Diriez-vous qu'il est négatif ou positif ? \nCommentaire : """+review,
"""Voici un avis laissé par un client sur un produit. Diriez-vous qu'il est négatif ou positif ? \nAvis : """+review,
"""Voici une critique laissée par un client sur un produit. Diriez-vous qu'elle est négative ou positive ? \nCritique : """+review,
"""Voici une évaluation laissée par un client sur un produit. Diriez-vous qu'elle est négative ou positive ? \nEvaluation : """+review,
'Commentaire du produit : "'+review+'" Ce commentaire dépeint le produit sous un angle négatif ou positif ?',
'Avis du produit : "'+review+'" Cet avis dépeint le produit sous un angle négatif ou positif ?',
'Critique du produit : "'+review+'" Cette critique dépeint le produit sous un angle négatif ou positif ?',
'Evaluation du produit : "'+review+'" Cette évaluation dépeint le produit sous un angle négatif ou positif ?',
'Le commentaire suivant exprime quel sentiment ?\n Commentaire' +review,
"""L'avis suivant exprime quel sentiment ?\n Avis""" +review,
'La critique suivante exprime quel sentiment ?\n Critique' +review,
"""L'évaluation suivante exprime quel sentiment ?\n Evaluation""" +review
```
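As an illustration, instantiating the first template above with a review produces the model input (the review text below is invented for illustration, not taken from the dataset):

```python
# Hypothetical example: instantiating the first prompt template above
# with a sample review (the review text is invented for illustration).
review = "Un roman magnifique, je le recommande."
prompt = 'Commentaire : "' + review + '" Le commentaire est-il positif ou négatif ?'
print(prompt)
```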
### Features used in the prompts
In the prompt list above, `review` and `targets` have been constructed from:
```
from datasets import load_dataset

fbr = load_dataset('Abirate/french_book_reviews')
targets = []
for i in range(len(fbr['train'])):
    review = fbr['train']['reader_review'][i]
    # Reviews rated below 2.5 are labeled negative, the others positive
    if fbr['train']['rating'][i] < 2.5:
        targets.append("neg")
    else:
        targets.append("pos")
```
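Putting the templates and the labels together, the construction of xP3-style (inputs, targets) pairs can be sketched as follows (the sample reviews and ratings are invented for illustration):

```python
# Hypothetical sketch: pairing a prompt template with the rating-derived
# label to obtain xP3-style (inputs, targets) rows. Sample data is invented.
reviews = ["Un roman magnifique, je le recommande.",
           "Une intrigue plate et sans intérêt."]
ratings = [4.5, 1.0]

rows = []
for review, rating in zip(reviews, ratings):
    inputs = 'Commentaire : "' + review + '" Le commentaire est-il positif ou négatif ?'
    targets = "neg" if rating < 2.5 else "pos"
    rows.append({"inputs": inputs, "targets": targets})
```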
# Splits
- `train` with 270,424 samples
- no `valid` split
- no `test` split
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/french_book_reviews_fr_prompt_sentiment_analysis")
```
# Citation
## Original data
> @misc {abir_eltaief_2023,
author = { {Abir ELTAIEF} },
title = { french_book_reviews (Revision 534725e) },
year = 2023,
url = { https://huggingface.co/datasets/Abirate/french_book_reviews },
doi = { 10.57967/hf/1052 },
publisher = { Hugging Face }}
## This Dataset
> @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { DFP (Revision 1d24c09) },
year = 2023,
url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
doi = { 10.57967/hf/1200 },
publisher = { Hugging Face }
}
## License
[CC0: Public Domain](https://creativecommons.org/publicdomain/zero/1.0/) |
CyberHarem/ks_23_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ks_23/KS-23/KS-23 (Girls' Frontline)
This is the dataset of ks_23/KS-23/KS-23 (Girls' Frontline), containing 17 images and their tags.
The core tags of this character are `breasts, orange_hair, large_breasts, yellow_eyes, ahoge, long_hair, red_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 16.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 10.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 21.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 14.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 28.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ks_23_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, fingerless_gloves, sharp_teeth, solo, cleavage, navel, simple_background, blush, midriff, white_background, shorts, black_gloves, elbow_gloves, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | fingerless_gloves | sharp_teeth | solo | cleavage | navel | simple_background | blush | midriff | white_background | shorts | black_gloves | elbow_gloves | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:--------------|:-------|:-----------|:--------|:--------------------|:--------|:----------|:-------------------|:---------|:---------------|:---------------|:--------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-2b-finetuned-sft-Navarasa-2.0 | ---
pretty_name: Evaluation run of Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0](https://huggingface.co/Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-2b-finetuned-sft-Navarasa-2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T17:22:11.552825](https://huggingface.co/datasets/open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-2b-finetuned-sft-Navarasa-2.0/blob/main/results_2024-03-22T17-22-11.552825.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38448642166678587,\n\
\ \"acc_stderr\": 0.03412711045658374,\n \"acc_norm\": 0.3882749361001093,\n\
\ \"acc_norm_stderr\": 0.03491709236659397,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.44689893094855493,\n\
\ \"mc2_stderr\": 0.014502506509597363\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4257679180887372,\n \"acc_stderr\": 0.014449464278868809,\n\
\ \"acc_norm\": 0.447098976109215,\n \"acc_norm_stderr\": 0.014529380160526845\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5071698864767975,\n\
\ \"acc_stderr\": 0.00498926836296872,\n \"acc_norm\": 0.6840270862378013,\n\
\ \"acc_norm_stderr\": 0.004639520453444027\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.039505818611799616,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.039505818611799616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n\
\ \"acc_stderr\": 0.02815603653823321,\n \"acc_norm\": 0.4290322580645161,\n\
\ \"acc_norm_stderr\": 0.02815603653823321\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3696969696969697,\n \"acc_stderr\": 0.03769430314512568,\n\
\ \"acc_norm\": 0.3696969696969697,\n \"acc_norm_stderr\": 0.03769430314512568\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.46464646464646464,\n \"acc_stderr\": 0.03553436368828063,\n \"\
acc_norm\": 0.46464646464646464,\n \"acc_norm_stderr\": 0.03553436368828063\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5077720207253886,\n \"acc_stderr\": 0.03608003225569654,\n\
\ \"acc_norm\": 0.5077720207253886,\n \"acc_norm_stderr\": 0.03608003225569654\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895992,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895992\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43853211009174314,\n \"acc_stderr\": 0.021274713073954565,\n \"\
acc_norm\": 0.43853211009174314,\n \"acc_norm_stderr\": 0.021274713073954565\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536037,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536037\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.37254901960784315,\n \"acc_stderr\": 0.03393388584958404,\n \"\
acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.03393388584958404\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4050632911392405,\n \"acc_stderr\": 0.03195514741370673,\n \
\ \"acc_norm\": 0.4050632911392405,\n \"acc_norm_stderr\": 0.03195514741370673\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3991031390134529,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.3991031390134529,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.42748091603053434,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.42748091603053434,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5454545454545454,\n \"acc_stderr\": 0.04545454545454546,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04545454545454546\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024932,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3619631901840491,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.3619631901840491,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.049486373240266376,\n\
\ \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.049486373240266376\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n\
\ \"acc_stderr\": 0.032408473935163266,\n \"acc_norm\": 0.5726495726495726,\n\
\ \"acc_norm_stderr\": 0.032408473935163266\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5159642401021711,\n\
\ \"acc_stderr\": 0.01787084750608173,\n \"acc_norm\": 0.5159642401021711,\n\
\ \"acc_norm_stderr\": 0.01787084750608173\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3786127167630058,\n \"acc_stderr\": 0.02611374936131034,\n\
\ \"acc_norm\": 0.3786127167630058,\n \"acc_norm_stderr\": 0.02611374936131034\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553988,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553988\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.028614624752805407,\n\
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.028614624752805407\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.41479099678456594,\n\
\ \"acc_stderr\": 0.027982680459759553,\n \"acc_norm\": 0.41479099678456594,\n\
\ \"acc_norm_stderr\": 0.027982680459759553\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4382716049382716,\n \"acc_stderr\": 0.027607914087400483,\n\
\ \"acc_norm\": 0.4382716049382716,\n \"acc_norm_stderr\": 0.027607914087400483\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343968,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343968\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3116036505867014,\n\
\ \"acc_stderr\": 0.01182903918284965,\n \"acc_norm\": 0.3116036505867014,\n\
\ \"acc_norm_stderr\": 0.01182903918284965\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29044117647058826,\n \"acc_stderr\": 0.027576468622740522,\n\
\ \"acc_norm\": 0.29044117647058826,\n \"acc_norm_stderr\": 0.027576468622740522\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3545751633986928,\n \"acc_stderr\": 0.0193533605475537,\n \
\ \"acc_norm\": 0.3545751633986928,\n \"acc_norm_stderr\": 0.0193533605475537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.047093069786618966,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.047093069786618966\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4122448979591837,\n \"acc_stderr\": 0.03151236044674281,\n\
\ \"acc_norm\": 0.4122448979591837,\n \"acc_norm_stderr\": 0.03151236044674281\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.472636815920398,\n\
\ \"acc_stderr\": 0.035302355173346824,\n \"acc_norm\": 0.472636815920398,\n\
\ \"acc_norm_stderr\": 0.035302355173346824\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.03834234744164993,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.03834234744164993\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.44689893094855493,\n\
\ \"mc2_stderr\": 0.014502506509597363\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6511444356748224,\n \"acc_stderr\": 0.01339505932013732\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09249431387414708,\n \
\ \"acc_stderr\": 0.007980396874560178\n }\n}\n```"
repo_url: https://huggingface.co/Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-22-11.552825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-22-11.552825.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- '**/details_harness|winogrande|5_2024-03-22T17-22-11.552825.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T17-22-11.552825.parquet'
- config_name: results
data_files:
- split: 2024_03_22T17_22_11.552825
path:
- results_2024-03-22T17-22-11.552825.parquet
- split: latest
path:
- results_2024-03-22T17-22-11.552825.parquet
---
# Dataset Card for Evaluation run of Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0](https://huggingface.co/Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-2b-finetuned-sft-Navarasa-2.0",
    "harness_winogrande_5",
    split="latest",
)
```
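The config names listed in the metadata above appear to follow a mechanical convention: the harness task name with `-` and `:` replaced by `_`, prefixed with `harness_` and suffixed with the few-shot count. A small sketch of that mapping, for illustration only (this mirrors the naming in this card, not an official API):

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Derive a config name like those listed in the YAML above.

    Illustrative only: it reproduces the apparent naming convention of
    this card, e.g. "hendrycksTest-world_religions" with 5 shots maps to
    "harness_hendrycksTest_world_religions_5".
    """
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{num_fewshot}"

print(config_name("hendrycksTest-world_religions", 5))  # harness_hendrycksTest_world_religions_5
print(config_name("truthfulqa:mc", 0))                  # harness_truthfulqa_mc_0
```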
## Latest results

These are the [latest results from run 2024-03-22T17:22:11.552825](https://huggingface.co/datasets/open-llm-leaderboard/details_Telugu-LLM-Labs__Indic-gemma-2b-finetuned-sft-Navarasa-2.0/blob/main/results_2024-03-22T17-22-11.552825.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38448642166678587,
"acc_stderr": 0.03412711045658374,
"acc_norm": 0.3882749361001093,
"acc_norm_stderr": 0.03491709236659397,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.01590598704818483,
"mc2": 0.44689893094855493,
"mc2_stderr": 0.014502506509597363
},
"harness|arc:challenge|25": {
"acc": 0.4257679180887372,
"acc_stderr": 0.014449464278868809,
"acc_norm": 0.447098976109215,
"acc_norm_stderr": 0.014529380160526845
},
"harness|hellaswag|10": {
"acc": 0.5071698864767975,
"acc_stderr": 0.00498926836296872,
"acc_norm": 0.6840270862378013,
"acc_norm_stderr": 0.004639520453444027
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.039505818611799616,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.039505818611799616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4290322580645161,
"acc_stderr": 0.02815603653823321,
"acc_norm": 0.4290322580645161,
"acc_norm_stderr": 0.02815603653823321
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3696969696969697,
"acc_stderr": 0.03769430314512568,
"acc_norm": 0.3696969696969697,
"acc_norm_stderr": 0.03769430314512568
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.46464646464646464,
"acc_stderr": 0.03553436368828063,
"acc_norm": 0.46464646464646464,
"acc_norm_stderr": 0.03553436368828063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5077720207253886,
"acc_stderr": 0.03608003225569654,
"acc_norm": 0.5077720207253886,
"acc_norm_stderr": 0.03608003225569654
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895992,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895992
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43853211009174314,
"acc_stderr": 0.021274713073954565,
"acc_norm": 0.43853211009174314,
"acc_norm_stderr": 0.021274713073954565
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536037,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536037
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.03393388584958404,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.03393388584958404
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4050632911392405,
"acc_stderr": 0.03195514741370673,
"acc_norm": 0.4050632911392405,
"acc_norm_stderr": 0.03195514741370673
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3991031390134529,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.3991031390134529,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.42748091603053434,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.42748091603053434,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04545454545454546,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04545454545454546
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024932,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3619631901840491,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.3619631901840491,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.4854368932038835,
"acc_stderr": 0.049486373240266376,
"acc_norm": 0.4854368932038835,
"acc_norm_stderr": 0.049486373240266376
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.032408473935163266,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.032408473935163266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5159642401021711,
"acc_stderr": 0.01787084750608173,
"acc_norm": 0.5159642401021711,
"acc_norm_stderr": 0.01787084750608173
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3786127167630058,
"acc_stderr": 0.02611374936131034,
"acc_norm": 0.3786127167630058,
"acc_norm_stderr": 0.02611374936131034
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553988,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553988
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.028614624752805407,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.028614624752805407
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.41479099678456594,
"acc_stderr": 0.027982680459759553,
"acc_norm": 0.41479099678456594,
"acc_norm_stderr": 0.027982680459759553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4382716049382716,
"acc_stderr": 0.027607914087400483,
"acc_norm": 0.4382716049382716,
"acc_norm_stderr": 0.027607914087400483
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343968,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343968
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3116036505867014,
"acc_stderr": 0.01182903918284965,
"acc_norm": 0.3116036505867014,
"acc_norm_stderr": 0.01182903918284965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29044117647058826,
"acc_stderr": 0.027576468622740522,
"acc_norm": 0.29044117647058826,
"acc_norm_stderr": 0.027576468622740522
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3545751633986928,
"acc_stderr": 0.0193533605475537,
"acc_norm": 0.3545751633986928,
"acc_norm_stderr": 0.0193533605475537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.047093069786618966,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.047093069786618966
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4122448979591837,
"acc_stderr": 0.03151236044674281,
"acc_norm": 0.4122448979591837,
"acc_norm_stderr": 0.03151236044674281
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.472636815920398,
"acc_stderr": 0.035302355173346824,
"acc_norm": 0.472636815920398,
"acc_norm_stderr": 0.035302355173346824
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.03834234744164993,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.03834234744164993
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.01590598704818483,
"mc2": 0.44689893094855493,
"mc2_stderr": 0.014502506509597363
},
"harness|winogrande|5": {
"acc": 0.6511444356748224,
"acc_stderr": 0.01339505932013732
},
"harness|gsm8k|5": {
"acc": 0.09249431387414708,
"acc_stderr": 0.007980396874560178
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tgokhale/sr2d_visor | ---
license: cc-by-nc-nd-4.0
viewer: false
---
# Benchmarking Spatial Relationships in Text-to-Image Generation
*Tejas Gokhale, Hamid Palangi, Besmira Nushi, Vibhav Vineet, Eric Horvitz, Ece Kamar, Chitta Baral, Yezhou Yang*
- We introduce a large-scale challenge dataset SR<sub>2D</sub> that contains sentences describing two objects and the spatial relationship between them.
- We introduce a metric called VISOR (short for **V**erify**I**ng **S**patial **O**bject **R**elationships) to quantify spatial reasoning performance.
- VISOR and SR<sub>2D</sub> can be used off-the-shelf with any text-to-image model.
## SR<sub>2D</sub> Dataset
Our dataset is hosted [here](https://huggingface.co/datasets/tgokhale/sr2d_visor). It contains
1. The text prompt dataset in `.json` format (`text_spatial_rel_phrases.json`)
2. Images generated using 7 models (GLIDE, CogView2, DALLE-mini, Stable Diffusion, GLIDE + Stable Diffusion + CDM, and Stable Diffusion v2.1)
Alternatively, the text prompt dataset can also be accessed from [`text_spatial_rel_phrases.json`](https://github.com/microsoft/VISOR/blob/main/text_spatial_rel_phrases.json). It contains all examples from the current version of the dataset (31680 text prompts) accompanied by the corresponding metadata.
This dataset can also be generated by running the script `python create_spatial_phrases.py`.
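To make the VISOR idea concrete, a centroid-based spatial-relationship check can be sketched as below. This is a simplified, hypothetical illustration, not the official metric implementation (which lives in the GitHub repository and may differ); the relation names are assumptions:

```python
# Hypothetical sketch of a VISOR-style check: given detected bounding boxes
# for object A and object B in a generated image, verify that their centroids
# satisfy the spatial relationship named in the text prompt.
# Image coordinates: x grows rightward, y grows downward.

def centroid(box):
    """Center (x, y) of a box given as (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = box
    return ((x_min + x_max) / 2, (y_min + y_max) / 2)

def relation_holds(box_a, box_b, relation):
    """Check whether object A stands in `relation` to object B."""
    (ax, ay), (bx, by) = centroid(box_a), centroid(box_b)
    checks = {
        "to the left of": ax < bx,
        "to the right of": ax > bx,
        "above": ay < by,
        "below": ay > by,
    }
    return checks[relation]

# A box on the left vs. a box on the right.
print(relation_holds((0, 0, 10, 10), (20, 0, 30, 10), "to the left of"))  # True
```

Averaging such per-image checks over the generated images for each prompt gives a VISOR-style score for a model.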
## GitHub repository
The GitHub repository for [VISOR](https://github.com/microsoft/VISOR/) contains code for generating images with prompts from the SR<sub>2D</sub> dataset and evaluating the generated images using VISOR.
## References
Code for text-to-image generation:
1. GLIDE: https://github.com/openai/glide-text2im
2. DALLE-mini: https://github.com/borisdayma/dalle-mini
3. CogView2: https://github.com/THUDM/CogView2
4. Stable Diffusion: https://github.com/CompVis/stable-diffusion
5. Composable Diffusion Models: https://github.com/energy-based-model/Compositional-Visual-Generation-with-Composable-Diffusion-Models-PyTorch
6. OpenAI API for DALLE-2: https://openai.com/api/
## Citation
If you find SR<sub>2D</sub> or VISOR useful in your research, please use the following citation:
```
@article{gokhale2022benchmarking,
title={Benchmarking Spatial Relationships in Text-to-Image Generation},
author={Gokhale, Tejas and Palangi, Hamid and Nushi, Besmira and Vineet, Vibhav and Horvitz, Eric and Kamar, Ece and Baral, Chitta and Yang, Yezhou},
journal={arXiv preprint arXiv:2212.10015},
year={2022}
}
``` |
sheepy928/purdue_reddit_posts_2017_2022 | ---
dataset_info:
features:
- name: title
dtype: string
- name: selftext
dtype: string
- name: created_utc
dtype: timestamp[ns]
- name: url
dtype: string
- name: author
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 25865572
num_examples: 78849
download_size: 15617426
dataset_size: 25865572
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "purdue_reddit_posts_2017_2022"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MinutoHype/Lair | ---
license: openrail
---
|
yzhuang/autotree_automl_bank-marketing_gosdt_l512_d3_sd1 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 811185053
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_bank-marketing_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MagedSaeed/MADBase | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 16186819.125
num_examples: 59999
- name: test
num_bytes: 2695549.125
num_examples: 9999
download_size: 15361996
dataset_size: 18882368.25
task_categories:
- image-classification
language:
- ar
pretty_name: Arabic Handwritten Digits Images Dataset
size_categories:
- 10K<n<100K
---
# Dataset Card for MADBase
## Dataset Description
- **Homepage:**
https://datacenter.aucegypt.edu/shazeem/
- **Repository:**
- **Paper:**
El-Sherif, E. A., & Abdelazeem, S. (2007). A Two-Stage System for Arabic Handwritten Digit Recognition Tested on a New Large Database. *Artificial Intelligence and Pattern Recognition*, 237-242.
- **Leaderboard:**
- **Point of Contact:**
Ezzat El-Sherif (ezzat.elsherif@gmail.com)
### Dataset Summary
MADBase is a dataset of 28x28 grayscale images of Arabic handwritten digits (0-9). It mirrors the structure of the MNIST dataset, with a training set of 60,000 images and a test set of 10,000 images.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Arabic
## Dataset Structure
### Data Instances
```
{
  'image': <PIL.PngImagePlugin.PngImageFile image mode=L size=28x28 at 0x7F5EE5B427A0>,
  'label': 1
}
```
### Data Fields
- `image`: a `PIL.Image.Image` object containing the 28x28 image. Note that when accessing the image column (`dataset[0]["image"]`) the image file is automatically decoded. Decoding a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, i.e. `dataset[0]["image"]` should always be preferred over `dataset["image"][0]`.
- `label`: an integer between 0 and 9 representing the digit.
### Data Splits
The data is split into a training set and a test set. As in the MNIST dataset, the training set contains 60,000 images and the test set 10,000 images.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is publicly available for research. Any work that uses this dataset should cite the work below in Citation Information.
### Citation Information
```
@inproceedings{el2007two,
title={A Two-Stage System for Arabic Handwritten Digit Recognition Tested on a New Large Database.},
author={El-Sherif, Ezzat Ali and Abdelazeem, Sherif},
booktitle={Artificial intelligence and pattern recognition},
pages={237--242},
year={2007}
}
```
### Contributions
[More Information Needed] |
jacquelinegrimm/ecoli-bsubtilis | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 9069568
num_examples: 17714
download_size: 4268212
dataset_size: 9069568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pdearena/Maxwell-3D | ---
license: mit
---
|
erkam/clevr-with-depth | ---
dataset_info:
features:
- name: image
dtype: image
- name: depth
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 115079852.0
num_examples: 1400
- name: test
num_bytes: 24726160.0
num_examples: 300
- name: val
num_bytes: 24696560.0
num_examples: 300
download_size: 164000762
dataset_size: 164502572.0
---
# Dataset Card for "clevr-with-depth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jondurbin__airoboros-7b | ---
pretty_name: Evaluation run of jondurbin/airoboros-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-7b](https://huggingface.co/jondurbin/airoboros-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T18:06:24.676047](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b/blob/main/results_2023-10-22T18-06-24.676047.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17260906040268456,\n\
\ \"em_stderr\": 0.0038701413394570546,\n \"f1\": 0.2366065436241609,\n\
\ \"f1_stderr\": 0.003924713658600719,\n \"acc\": 0.3653891607870639,\n\
\ \"acc_stderr\": 0.00836463128895662\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.17260906040268456,\n \"em_stderr\": 0.0038701413394570546,\n\
\ \"f1\": 0.2366065436241609,\n \"f1_stderr\": 0.003924713658600719\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \
\ \"acc_stderr\": 0.003970449129848635\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7095501183898973,\n \"acc_stderr\": 0.012758813448064605\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|arc:challenge|25_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T14_45_03.527749
path:
- '**/details_harness|drop|3_2023-10-22T14-45-03.527749.parquet'
- split: 2023_10_22T18_06_24.676047
path:
- '**/details_harness|drop|3_2023-10-22T18-06-24.676047.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T18-06-24.676047.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T14_45_03.527749
path:
- '**/details_harness|gsm8k|5_2023-10-22T14-45-03.527749.parquet'
- split: 2023_10_22T18_06_24.676047
path:
- '**/details_harness|gsm8k|5_2023-10-22T18-06-24.676047.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T18-06-24.676047.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hellaswag|10_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:41.797707.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-03T10:53:16.079239.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:47:41.797707.parquet'
- split: 2023_08_03T10_53_16.079239
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-03T10:53:16.079239.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-03T10:53:16.079239.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T14_45_03.527749
path:
- '**/details_harness|winogrande|5_2023-10-22T14-45-03.527749.parquet'
- split: 2023_10_22T18_06_24.676047
path:
- '**/details_harness|winogrande|5_2023-10-22T18-06-24.676047.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T18-06-24.676047.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_47_41.797707
path:
- results_2023-07-19T16:47:41.797707.parquet
- split: 2023_08_03T10_53_16.079239
path:
- results_2023-08-03T10:53:16.079239.parquet
- split: 2023_10_22T14_45_03.527749
path:
- results_2023-10-22T14-45-03.527749.parquet
- split: 2023_10_22T18_06_24.676047
path:
- results_2023-10-22T18-06-24.676047.parquet
- split: latest
path:
- results_2023-10-22T18-06-24.676047.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b](https://huggingface.co/jondurbin/airoboros-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
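Since run splits are named after timestamps, the most recent run can also be identified programmatically. A minimal sketch using the standard library only (the split names below are taken from this card's own configs):

```python
from datetime import datetime

# Two timestamp-named splits, as they appear in this card's configs.
splits = ["2023_10_22T14_45_03.527749", "2023_10_22T18_06_24.676047"]

def parse_split(name: str) -> datetime:
    # Split names replace the ISO 8601 "-" and ":" separators with "_".
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

latest = max(splits, key=parse_split)
print(latest)  # -> 2023_10_22T18_06_24.676047
```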
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-22T18:06:24.676047](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b/blob/main/results_2023-10-22T18-06-24.676047.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"em": 0.17260906040268456,
"em_stderr": 0.0038701413394570546,
"f1": 0.2366065436241609,
"f1_stderr": 0.003924713658600719,
"acc": 0.3653891607870639,
"acc_stderr": 0.00836463128895662
},
"harness|drop|3": {
"em": 0.17260906040268456,
"em_stderr": 0.0038701413394570546,
"f1": 0.2366065436241609,
"f1_stderr": 0.003924713658600719
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
},
"harness|winogrande|5": {
"acc": 0.7095501183898973,
"acc_stderr": 0.012758813448064605
}
}
```
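As a quick sanity check, the per-task numbers above can be recombined without the `datasets` library; a sketch using this card's own figures:

```python
# Per-task metrics copied from the "Latest results" block above.
results = {
    "harness|drop|3": {"em": 0.17260906040268456, "f1": 0.2366065436241609},
    "harness|gsm8k|5": {"acc": 0.02122820318423048},
    "harness|winogrande|5": {"acc": 0.7095501183898973},
}

# Mean accuracy over the tasks that report "acc" (gsm8k and winogrande),
# which reproduces the "acc" value shown in the "all" block.
accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # approximately 0.36539, matching the "all" block's acc
```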
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AdapterOcean/biology_dataset_standardized_cluster_3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 83889810
num_examples: 7464
download_size: 0
dataset_size: 83889810
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
M-AI-C/quran_tafseer | ---
dataset_info:
features:
- name: en-ahmedali
dtype: string
- name: en-ahmedraza
dtype: string
- name: en-arberry
dtype: string
- name: en-asad
dtype: string
- name: en-daryabadi
dtype: string
- name: en-hilali
dtype: string
- name: en-itani
dtype: string
- name: en-maududi
dtype: string
- name: en-mubarakpuri
dtype: string
- name: en-pickthall
dtype: string
- name: en-qarai
dtype: string
- name: en-qaribullah
dtype: string
- name: en-sahih
dtype: string
- name: en-sarwar
dtype: string
- name: en-shakir
dtype: string
- name: en-transliterati
dtype: string
- name: en-wahiduddi
dtype: string
- name: en-yusufali
dtype: string
- name: ayah
dtype: int64
- name: sorah
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 17616789
num_examples: 6235
download_size: 9631631
dataset_size: 17616789
---
# Dataset Card for "quran_tafseer"
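Every row carries the sorah (chapter) and ayah (verse) indices alongside the translation columns, so a chapter can be sliced out with a plain filter. A minimal sketch over toy rows that mimic this schema (the real data loads via `load_dataset("M-AI-C/quran_tafseer")`, which needs network access):

```python
# Toy rows mimicking the features above: sorah/ayah ints plus translation strings.
rows = [
    {"sorah": 1, "ayah": 2, "en-sahih": "..."},
    {"sorah": 1, "ayah": 1, "en-sahih": "..."},
    {"sorah": 2, "ayah": 1, "en-sahih": "..."},
]

def select_sorah(rows, sorah):
    """Return one chapter's rows, ordered by verse number."""
    return sorted((r for r in rows if r["sorah"] == sorah), key=lambda r: r["ayah"])

print([r["ayah"] for r in select_sorah(rows, 1)])  # [1, 2]
```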
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Test157t__Eris-Floramix-7b | ---
pretty_name: Evaluation run of Test157t/Eris-Floramix-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Test157t/Eris-Floramix-7b](https://huggingface.co/Test157t/Eris-Floramix-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Eris-Floramix-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T20:04:17.875681](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Eris-Floramix-7b/blob/main/results_2024-03-07T20-04-17.875681.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522949272776828,\n\
\ \"acc_stderr\": 0.03219180473630854,\n \"acc_norm\": 0.6517375885341681,\n\
\ \"acc_norm_stderr\": 0.03286628210816367,\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.7096103441812819,\n\
\ \"mc2_stderr\": 0.014868040276567759\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520766,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710698\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7176857199761004,\n\
\ \"acc_stderr\": 0.0044920552794071094,\n \"acc_norm\": 0.8827922724556861,\n\
\ \"acc_norm_stderr\": 0.003210102507177253\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579823,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579823\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n\
\ \"acc_stderr\": 0.016476342210254,\n \"acc_norm\": 0.4145251396648045,\n\
\ \"acc_norm_stderr\": 0.016476342210254\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"\
acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.7096103441812819,\n\
\ \"mc2_stderr\": 0.014868040276567759\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272962\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \
\ \"acc_stderr\": 0.012880360794851812\n }\n}\n```"
repo_url: https://huggingface.co/Test157t/Eris-Floramix-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|arc:challenge|25_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|gsm8k|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hellaswag|10_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T20-04-17.875681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T20-04-17.875681.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- '**/details_harness|winogrande|5_2024-03-07T20-04-17.875681.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T20-04-17.875681.parquet'
- config_name: results
data_files:
- split: 2024_03_07T20_04_17.875681
path:
- results_2024-03-07T20-04-17.875681.parquet
- split: latest
path:
- results_2024-03-07T20-04-17.875681.parquet
---
# Dataset Card for Evaluation run of Test157t/Eris-Floramix-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Eris-Floramix-7b](https://huggingface.co/Test157t/Eris-Floramix-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Eris-Floramix-7b",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-07T20:04:17.875681](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Eris-Floramix-7b/blob/main/results_2024-03-07T20-04-17.875681.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6522949272776828,
"acc_stderr": 0.03219180473630854,
"acc_norm": 0.6517375885341681,
"acc_norm_stderr": 0.03286628210816367,
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.7096103441812819,
"mc2_stderr": 0.014868040276567759
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520766,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710698
},
"harness|hellaswag|10": {
"acc": 0.7176857199761004,
"acc_stderr": 0.0044920552794071094,
"acc_norm": 0.8827922724556861,
"acc_norm_stderr": 0.003210102507177253
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579823,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579823
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210254,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210254
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.7096103441812819,
"mc2_stderr": 0.014868040276567759
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272962
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.012880360794851812
}
}
```
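As a small illustration (assuming only the JSON structure shown above; the task subset below is made up for the example), the per-task 5-shot "hendrycksTest" accuracies can be averaged in plain Python once the results file is loaded:

```python
# Sketch: average the "hendrycksTest" (MMLU) accuracies from a results
# dict shaped like the JSON above. Illustrative three-task subset only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
}

mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU tasks: {len(mmlu_accs)}, mean acc: {mmlu_mean:.4f}")
```

The same pattern extends to the full 57-task set, or to `acc_norm` values, by adjusting the key filter.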
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
warleagle/erokhinVi | ---
license: mit
---
|
Nexdata/380000_Groups_Uighur_Chinese_Parallel_Corpus_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Uighur-language text data with its parallel corresponding Chinese text, 380,000 groups in total. The data has been cleaned, desensitized, and quality-checked. It can be used as a base corpus for text data analysis in machine translation and related fields.
For more details, please refer to the link: https://www.nexdata.ai/dataset/194?source=Huggingface
# Specifications
## Storage format
TXT
## Data content
Uighur-Chinese Parallel Corpus Data
## Data size
0.38 million pairs of Uighur-Chinese Parallel Corpus Data
## Language
Uighur, Chinese
## Application scenario
machine translation
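As a rough sketch of how such a corpus might be consumed (the actual TXT layout is not specified on this card; the code below assumes one tab-separated Uighur/Chinese pair per line, which is a common but hypothetical convention):

```python
# Hypothetical parallel-corpus parser: assumes each line is
# "<uighur>\t<chinese>". The real file layout may differ; adjust the
# delimiter to match the delivered TXT files.
def parse_parallel_corpus(text: str) -> list[tuple[str, str]]:
    pairs = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue  # skip blank lines
        src, tgt = line.split("\t", 1)
        pairs.append((src, tgt))
    return pairs

sample = "سالام دۇنيا\t你好,世界\nرەھمەت\t谢谢\n"
print(parse_parallel_corpus(sample))
```

Pairs parsed this way can be fed directly into standard machine-translation training pipelines as (source, target) examples.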
# Licensing Information
Commercial License
|
AlekseyKorshuk/up-it-ds-sft | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 196574167
num_examples: 317184
- name: validation
num_bytes: 22058238
num_examples: 35244
download_size: 135217201
dataset_size: 218632405
---
# Dataset Card for "up-it-ds-sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/13ce18ce | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1322
dataset_size: 176
---
# Dataset Card for "13ce18ce"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gyoungjr/amazon-electronics-reviews | ---
dataset_info:
features:
- name: labels
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 23650276.656079177
num_examples: 86932
- name: test
num_bytes: 2628050.343920822
num_examples: 9660
download_size: 15929202
dataset_size: 26278327.0
---
# Dataset Card for "amazon-electronics-reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/650_Hours_Uyghur_Spontaneous_Speech_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Uyghur (China) real-world casual conversation and monologue speech dataset, covering interviews, variety shows, live streams, etc., mirroring real-world interactions. Recordings are transcribed with text content, speaker ID, gender, and other attributes. The dataset was collected from an extensive and geographically diverse set of speakers, enhancing model performance on real and complex tasks, and has been quality-tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring user privacy and legal rights are maintained throughout data collection, storage, and usage; our datasets are GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1117?source=Huggingface
# Specifications
## Format
16kHz, 16 bit, wav, mono channel;
## Content category
Interviews, variety shows, live streams, etc.;
## Recording environment
Low background noise;
## Country
China(CHN);
## Language(Region) Code
ug-CN;
## Language
Uyghur;
## Features of annotation
Transcription text, timestamp, speaker ID, gender.
## Accuracy Rate
Sentence Accuracy Rate (SAR) 95%
# Licensing Information
Commercial License
|
jhan21/amazon-food-reviews-dataset | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- expert-generated
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: amazon-food-reviews-dataset
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- amazon
- reviews
- food reviews
- business
task_categories:
- text-classification
task_ids: []
---
# Dataset Card for "Amazon Food Reviews"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset consists of reviews of fine foods from amazon. The data span a period of more than 10 years, including all ~500,000 reviews up to October 2012. Reviews include product and user information, ratings, and a plain text review. It also includes reviews from all other Amazon categories.
### Supported Tasks and Leaderboards
This dataset can be used for numerous tasks like sentiment analysis, text classification, and user behavior analysis. It's particularly useful for training models to understand customer feedback and preferences.
### Languages
The reviews are primarily in English.
## Dataset Structure
### Data Instances
A typical data instance comprises a review with fields like product ID, user ID, rating, review text, helpfulness votes, and time of the review.
### Data Fields
- `ProductId`: Unique identifier for the product
- `UserId`: Unique identifier for the user
- `ProfileName`: Profile name of the user
- `HelpfulnessNumerator`: Number of users who found the review helpful
- `HelpfulnessDenominator`: Number of users who indicated whether they found the review helpful or not
- `Score`: Rating between 1 and 5
- `Time`: Timestamp of the review
- `Summary`: Brief summary of the review
- `Text`: Text of the review
### Data Splits
The dataset is not split into standard training/validation/testing sets. Users may need to create these splits as per their requirement.
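Since no standard splits ship with the data, a minimal sketch of building a reproducible 80/10/10 split with pandas (the split ratio, the toy stand-in table, and the `Reviews.csv` filename in the comment are assumptions, not part of this card):

```python
import pandas as pd

# Toy stand-in for the review table; in practice you would load the full CSV,
# e.g. pd.read_csv("Reviews.csv") -- the filename is an assumption.
df = pd.DataFrame({
    "ProductId": [f"P{i}" for i in range(10)],
    "Score": [1, 2, 3, 4, 5, 1, 2, 3, 4, 5],
    "Text": ["review text"] * 10,
})

# Shuffle once with a fixed seed so the splits are reproducible,
# then slice into train / validation / test (80% / 10% / 10%).
df = df.sample(frac=1.0, random_state=42).reset_index(drop=True)
n = len(df)
train = df.iloc[: int(0.8 * n)]
validation = df.iloc[int(0.8 * n) : int(0.9 * n)]
test = df.iloc[int(0.9 * n) :]
```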
## Dataset Creation
### Curation Rationale
The dataset was created to provide a large collection of textual reviews with sentiment labels, useful for tasks in sentiment analysis and natural language processing.
### Source Data
#### Initial Data Collection and Normalization
The data was collected from Amazon's food reviews section.
#### Who are the source language producers?
The source language producers are the Amazon users / customers who provided these reviews.
### Annotations
#### Annotation process
The reviews come with ratings that can be converted into sentiment labels, but no additional annotation process was described.
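As a sketch, one common convention for turning the 1–5 `Score` field into sentiment labels (the exact thresholds below are an assumption, not specified by this card):

```python
def score_to_sentiment(score: int) -> str:
    """Map a 1-5 review score to a coarse sentiment label.

    Thresholds are a common convention, not part of the dataset:
    4-5 -> positive, 1-2 -> negative, 3 -> neutral.
    """
    if score >= 4:
        return "positive"
    if score <= 2:
        return "negative"
    return "neutral"
```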
#### Who are the annotators?
The annotators are the Amazon users who left the reviews and ratings.
### Personal and Sensitive Information
The dataset contains user IDs and profile names which could potentially be used to identify the reviewers.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset provides insights into consumer preferences and sentiment, which can be valuable for businesses and researchers. However, care should be taken to ensure that models trained on this data do not reinforce stereotypes or biases present in the reviews.
### Discussion of Biases
The dataset may contain biases inherent in the user base of Amazon, which may not be representative of the general population.
### Other Known Limitations
The dataset's scope is limited to food products and may not generalize well to other types of products or reviews.
## Additional Information
### Dataset Curators
The dataset was originally curated by the SNAP group.
### Licensing Information
The dataset is available under a CC BY-SA 4.0 license.
### Citation Information
If you publish articles based on this dataset, please cite the following paper:
J. McAuley and J. Leskovec. _From amateurs to connoisseurs: modeling the evolution of user expertise through online reviews_. WWW, 2013.
### Contributions
Thanks to [@Stanford Network Analysis Project](https://www.kaggle.com/datasets/snap/amazon-fine-food-reviews/data) for adding this dataset. |
maxmendola/CS27pair | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth | ---
pretty_name: Evaluation run of pythainlp/wangchanglm-7.5B-sft-enth
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pythainlp/wangchanglm-7.5B-sft-enth](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-12T15:30:29.311096](https://huggingface.co/datasets/open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth/blob/main/results_2023-10-12T15-30-29.311096.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0975251677852349,\n\
\ \"em_stderr\": 0.0030381943660923163,\n \"f1\": 0.15908871644295358,\n\
\ \"f1_stderr\": 0.0032751413358900056,\n \"acc\": 0.2923141410254953,\n\
\ \"acc_stderr\": 0.007937916046478193\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0975251677852349,\n \"em_stderr\": 0.0030381943660923163,\n\
\ \"f1\": 0.15908871644295358,\n \"f1_stderr\": 0.0032751413358900056\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480626\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.579321231254933,\n \"acc_stderr\": 0.013874526372008323\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_12T15_30_29.311096
path:
- '**/details_harness|drop|3_2023-10-12T15-30-29.311096.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-12T15-30-29.311096.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_12T15_30_29.311096
path:
- '**/details_harness|gsm8k|5_2023-10-12T15-30-29.311096.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-12T15-30-29.311096.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:30:03.574829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:30:03.574829.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:30:03.574829.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_12T15_30_29.311096
path:
- '**/details_harness|winogrande|5_2023-10-12T15-30-29.311096.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-12T15-30-29.311096.parquet'
- config_name: results
data_files:
- split: 2023_07_18T11_30_03.574829
path:
- results_2023-07-18T11:30:03.574829.parquet
- split: 2023_10_12T15_30_29.311096
path:
- results_2023-10-12T15-30-29.311096.parquet
- split: latest
path:
- results_2023-10-12T15-30-29.311096.parquet
---
# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-enth
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pythainlp/wangchanglm-7.5B-sft-enth](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-12T15:30:29.311096](https://huggingface.co/datasets/open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth/blob/main/results_2023-10-12T15-30-29.311096.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0975251677852349,
"em_stderr": 0.0030381943660923163,
"f1": 0.15908871644295358,
"f1_stderr": 0.0032751413358900056,
"acc": 0.2923141410254953,
"acc_stderr": 0.007937916046478193
},
"harness|drop|3": {
"em": 0.0975251677852349,
"em_stderr": 0.0030381943660923163,
"f1": 0.15908871644295358,
"f1_stderr": 0.0032751413358900056
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480626
},
"harness|winogrande|5": {
"acc": 0.579321231254933,
"acc_stderr": 0.013874526372008323
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HumanCompatibleAI/ppo-Pendulum-v1 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float32
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float32
splits:
- name: train
num_bytes: 2575710
num_examples: 200
download_size: 940375
dataset_size: 2575710
---
# Dataset Card for "ppo-Pendulum-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dhruvrnaik/flintstones_story | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4816061959.792
num_examples: 20656
- name: test
num_bytes: 588052405.413
num_examples: 2377
- name: validation
num_bytes: 529750545.045
num_examples: 2135
download_size: 6232281749
dataset_size: 5933864910.25
---
# Dataset Card for "flintstones_story"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
paullatham1/reddit-test-balanced | ---
dataset_info:
features:
- name: 'Unnamed: 0.1'
dtype: int64
- name: 'Unnamed: 0'
dtype: int64
- name: is_sarcastic
dtype: int64
- name: data
dtype: string
- name: is_sarcastic.1
dtype: int64
splits:
- name: train
num_bytes: 719656
num_examples: 9914
download_size: 451349
dataset_size: 719656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MatsuoDochiai/Sans | ---
license: openrail
---
|
bigcode/stack-exchange-preferences-20230914-clean-anonymization | ---
dataset_info:
features:
- name: qid
dtype: int64
- name: question
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: author
dtype: string
- name: author_id
dtype: int64
- name: author_profile
dtype: string
- name: pm_score
dtype: int64
- name: selected
dtype: bool
- name: text
dtype: string
- name: date
dtype: string
- name: metadata
sequence: string
splits:
- name: train
num_bytes: 37966876013
num_examples: 10404628
download_size: 17879223994
dataset_size: 37966876013
---
# Dataset Card for "stack-exchange-preferences-20230914-clean-anonymization"
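Each record's `answers` list carries a `pm_score` and a `selected` flag (see the feature schema above); a minimal sketch for picking the preferred answer from one record (`best_answer` is an illustrative helper, not part of the dataset tooling):

```python
def best_answer(example):
    """Return the highest-scored answer for one record, breaking
    pm_score ties in favour of the accepted ('selected') answer."""
    return max(example["answers"], key=lambda a: (a["pm_score"], a["selected"]))
```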
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vinnyyw/Silenciorbd | ---
license: openrail
---
|
goodcoffee/covidQA_training_v2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: train
num_bytes: 3651192
num_examples: 1413
download_size: 0
dataset_size: 3651192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "covidQA_training_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mesolitica/snapshot-instagram | ---
language:
- ms
---
# Snapshot Instagram
Snapshot done by https://github.com/amzar96; total sizes:
1. [instagram.jsonl](instagram.jsonl), 90469 rows.
2. [text-only-instragram.jsonl](text-only-instragram.jsonl), 695571 rows.
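Both files are JSON Lines; a minimal reader sketch (assuming a file has been downloaded locally — `load_jsonl` is an illustrative helper, not part of this repo):

```python
import json

def load_jsonl(path):
    """Read a JSON Lines file into a list of dicts, skipping blank lines."""
    rows = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                rows.append(json.loads(line))
    return rows
```

For the larger file, iterating line by line instead of materializing the full list keeps memory use flat.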
## Example data
```python
{'_index': 'instagram-v2',
'_type': '_doc',
'_id': 'instagram-v2+6+9042',
'_score': 1,
'_source': {'datetime': '2021-12-19T15:52:32',
'comment': {'date': 'Nov 26, 2021',
'url_to_reply': 'https://www.instagram.com/p/CWukPPBJ2sz/c/17943481405632363/r/17929092841798589/',
'like': 0,
'mention_to': {'total': 1, 'username': ['budnyzam']},
'text': '',
'original_time': '3w',
'time': '2021-11-26T16:07:39',
'username': 'carideeyhamad'},
'id': 8310346298799313000,
'page': {'page_details': {'profile_img': {'src': 'null', 'alt': 'null'},
'verified': True,
'url': 'https://www.instagram.com/rhbgroup/',
'username': 'rhbgroup'},
'post_details': {'date': 'Nov 26, 2021',
'url_to_post': 'https://www.instagram.com/p/CWukPPBJ2sz/',
'like': 0,
'text': 'congratulations to the winners of contest and thank you to all participants who have joined in the fun the overflow of creativity shared across was exciting to witness but ultimately the drive to spread awareness on what ride for good meant to each of you inspires us so much more don t forget to follow our page and stay tuned for more exiting contests',
'time': '2021-11-26T13:38:04',
'owner_post_details': ['rhbgroup', 'Verified'],
'username': 'rhbgroup'}},
'user': {'profile_img': {'src': 'https://instagram.fblr22-1.fna.fbcdn.net/v/t51.2885-19/s150x150/250295249_389125132945260_1828200694152772754_n.jpg?_nc_ht=instagram.fblr22-1.fna.fbcdn.net&_nc_cat=100&_nc_ohc=cM1H0fdDiwsAX8R8M49&edm=AABBvjUBAAAA&ccb=7-4&oh=00_AT83AGK3Vz42I7pPoI27lsAQcxZVWSDswzjbSod1RoOXgw&oe=61C614AE&_nc_sid=83d603',
'alt': "carideeyhamad's profile picture"},
'verified': False,
'url': 'https://www.instagram.com/carideeyhamad/',
'username': 'carideeyhamad'}}}
```
## Example data text only
```
"Roti Jala 10rb/pck #rotijala #nomnommedan #kulinermedan #rotijalamedan #rotijalakari #rotijalakarimedan"
"SUPLEMEN PENINGGI BADAN\n@supergrow.official22\n\nMau Meninggikan Badan secara Cepat, Sehat dan Alami hanya dalam Hitungan Minggu ?\n\nKenggulan Paket Peninggi Super Grow Up\n- Meninggikan Badan secara Cepat, Sehat dan Alami - Memadatkan Tulang dan Merangsang Pertumbuhan Tulang\n- Terbuat dari Kalsium Organik dengan Penyerapan Terbaik di Dunia\n- Mengandung Multivitamin Zinc, Vitamin D dan Berbagai Nutrisi untuk membantu Pertumbuhan\n- Aman Tanpa Efek Samping (MUI,BPOM,GMP)\n- Bisa dikonsumsi dari usia 12 - 35 Tahun untuk Menambahkan Tinggi Badan\n\nUmur Anda bukan Penghambat untuk Menaikkan Tinggi Badan loh\n\nMumpung masih usia 12-35 Tahun sudah saatnya Impian Anda mempunyai Tinggi Badan yg Ideal terwujud setelah Mengkonsumsi Paket Peninggi Super Grow Up kita\n\nSelain suplemen peninggi badan, kita juga menjual suplemen penambah berat badan:) Mau?\nKonsultasikan sekarang yuk di\n@supergrow.official22\nWA :+628986038290\n#wanita #omteloletom #dagelan #sexy #ppap #nutrisi #renang #savegempi #peninggibadan #pesilat #pilot #pramugari #supergrowup #caratinggi #obattinggi #tinggialami #solusipendek #polisi #tentara #indonesia #perwira #traveller #travelling #onlineshop #tiktok #viral #basket"
"Wanita PKR Melaka terkejut dan kesal dengan tindakan pemimpin cabangnya Rohani Mahmood mengumumkan menyertai Umno.\n\nKetua Wanita PKR negeri, Ginie Lim berkata punca sebenar bekas ketua Wanita cabang Tangga Batu itu akan disiasat dan dibawa kepada biro politik parti itu. \"Pengumuman mantan ketua Wanita Cabang Tangga Batu Rohani Mahmood menyertai Umno dalam sosial media Facebook adalah mengejutkan. \"Tindakan Rohani menyertai pembangkang adalah amat dikesali. Saya akan menyiasat punca sebenar dan memberi penjelasan kepada Biro Politik,\" katanya dalam suatu kenyataan hari ini.\n\nLim menjelaskan pihaknya tidak pernah menerima sebarang surat perletakan jawatan daripada Rohani.\n\nSebaliknya, kata Lim, berdasarkan peraturan Wanita PKR, Rohani secara teknikalnya masih kekal sebagai Ketua Wanita Cabang.\n\nSementara itu, Lim yang juga Adun Machap Jaya mengesahkan perletakan jawatan ketua Wanita PKR Cabang Kota Melaka Laila Maidon dibuat atas faktor kesihatan. \"Surat perletakan jawatan telah diterima dalam Mesyuarat Wanita Negeri pada 21 Mei. Perkara ini turut dimaklumkan kepada ketua Wanita Pusat, pengerusi Majlis Pimpinan Negeri dan ketua Cabang PKR Kota Melaka pada 24 Mei,\" katanya.\n#umnomalaysia\n#umnojohor\n#barisannasional \n#pemudaumnomalaysia #pemudaumnojohor #wanitaumnomalaysia \n#wanitaumnojohor \n#puteriumnomalaysia \n#puteriumnojohor \n#umnobahagianpulai\n#umnopulai\n#umnobangkit \n#partiislamsemalaysiapas"
``` |
BangumiBase/mahoushoujosite | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Mahou Shoujo Site
This is the image base of the bangumi Mahou Shoujo Site; we detected 52 characters and 3729 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned, so they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
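As a sketch of that manual-cleaning workflow (assuming one of the per-character zips, e.g. `12/dataset.zip`, has already been downloaded; `extract_character_pack` is an illustrative helper):

```python
import os
import zipfile

def extract_character_pack(zip_path, out_dir):
    """Extract one character's image pack so the files can be inspected
    and noisy samples deleted before training."""
    os.makedirs(out_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out_dir)
    return sorted(os.listdir(out_dir))
```

After extraction, a quick visual pass over the returned file list is usually enough to catch the roughly 1% of noisy samples.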
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 47 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 34 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 19 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 52 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 11 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 28 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 56 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 23 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 172 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 100 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 52 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 69 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 624 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 102 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 12 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 11 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 34 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 13 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 33 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 43 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 107 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 14 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 54 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 6 | [Download](23/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 24 | 8 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 29 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 14 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 13 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 16 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 28 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 341 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 206 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 29 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 233 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 15 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 18 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 14 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 16 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 40 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 63 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 422 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 87 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 6 | [Download](42/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 43 | 48 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 9 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 7 | [Download](45/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 46 | 8 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 70 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 21 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 43 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 63 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 146 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
vanessa0688/ADL2023HW1 | ---
license: apache-2.0
language:
- zh
size_categories:
- 100K<n<1M
---
Task categories:
- Paragraph Selection
- Span selection |
Littlesalt33/zxy-aigc | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 388
num_examples: 6
download_size: 1589
dataset_size: 388
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/tennessee_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tennessee/テネシー/田纳西 (Azur Lane)
This is the dataset of tennessee/テネシー/田纳西 (Azur Lane), containing 49 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hair_between_eyes, blue_eyes, breasts, dark_skin, hat, bangs, dark-skinned_female, peaked_cap, white_headwear, medium_breasts, crossed_bangs, large_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 49 | 49.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tennessee_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 49 | 29.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tennessee_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 107 | 60.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tennessee_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 49 | 44.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tennessee_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 107 | 81.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tennessee_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tennessee_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_gloves, buttons, necktie, solo, white_dress, fingerless_gloves, looking_at_viewer, microdress, simple_background, black_thighhighs, cleavage, jacket_on_shoulders, long_sleeves, short_dress, thigh_strap, white_background, closed_mouth, coat, full_body |
| 1 | 10 |  |  |  |  |  | 1girl, black_gloves, blue_necktie, buttons, fingerless_gloves, microdress, solo, standing, thigh_strap, white_dress, black_thighhighs, short_dress, white_panties, pantyshot, black_belt, jacket_on_shoulders, belt_buckle, closed_mouth, eyelashes, legs_apart, ass_visible_through_thighs, long_sleeves, looking_at_viewer, machinery, turret, blush, detached_collar, open_mouth, simple_background, straight_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | buttons | necktie | solo | white_dress | fingerless_gloves | looking_at_viewer | microdress | simple_background | black_thighhighs | cleavage | jacket_on_shoulders | long_sleeves | short_dress | thigh_strap | white_background | closed_mouth | coat | full_body | blue_necktie | standing | white_panties | pantyshot | black_belt | belt_buckle | eyelashes | legs_apart | ass_visible_through_thighs | machinery | turret | blush | detached_collar | open_mouth | straight_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------|:----------|:-------|:--------------|:--------------------|:--------------------|:-------------|:--------------------|:-------------------|:-----------|:----------------------|:---------------|:--------------|:--------------|:-------------------|:---------------|:-------|:------------|:---------------|:-----------|:----------------|:------------|:-------------|:--------------|:------------|:-------------|:-----------------------------|:------------|:---------|:--------|:------------------|:-------------|:----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | | X | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
shermansiu/dm_graphcast_datasets | ---
license: cc-by-4.0
tags:
- weather-forecasting
- climate
language:
- en
pretty_name: ECMWF's ERA5, HRES, (and fake) data, formatted for DeepMind GraphCast
configs:
- config_name: source-era5_date-2022-01-01_res-0.25_levels-13_steps-01
data_files: "dataset/source-era5_date-2022-01-01_res-0.25_levels-13_steps-01.nc"
- config_name: source-era5_date-2022-01-01_res-0.25_levels-13_steps-04
data_files: "dataset/source-era5_date-2022-01-01_res-0.25_levels-13_steps-04.nc"
- config_name: source-era5_date-2022-01-01_res-0.25_levels-13_steps-12
data_files: "dataset/source-era5_date-2022-01-01_res-0.25_levels-13_steps-12.nc"
- config_name: source-era5_date-2022-01-01_res-0.25_levels-37_steps-01
data_files: "dataset/source-era5_date-2022-01-01_res-0.25_levels-37_steps-01.nc"
- config_name: source-era5_date-2022-01-01_res-0.25_levels-37_steps-04
data_files: "dataset/source-era5_date-2022-01-01_res-0.25_levels-37_steps-04.nc"
- config_name: source-era5_date-2022-01-01_res-0.25_levels-37_steps-12
data_files: "dataset/source-era5_date-2022-01-01_res-0.25_levels-37_steps-12.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-13_steps-01
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-13_steps-01.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-13_steps-04
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-13_steps-04.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-13_steps-12
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-13_steps-12.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-13_steps-20
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-13_steps-20.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-13_steps-40
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-13_steps-40.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-37_steps-01
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-37_steps-01.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-37_steps-04
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-37_steps-04.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-37_steps-12
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-37_steps-12.nc"
- config_name: source-era5_date-2022-01-01_res-1.0_levels-37_steps-20
data_files: "dataset/source-era5_date-2022-01-01_res-1.0_levels-37_steps-20.nc"
---
# ECMWF's ERA5, HRES, (and fake) data, formatted for DeepMind GraphCast
Original files are from this Google Cloud Bucket: https://console.cloud.google.com/storage/browser/dm_graphcast
This repo contains both the `dataset` and `stats` files needed for GraphCast inference.
## License and Attribution
ECMWF data products are subject to the following terms:
1. Copyright statement: Copyright "© 2023 European Centre for Medium-Range Weather Forecasts (ECMWF)".
2. Source www.ecmwf.int
3. Licence Statement: ECMWF data is published under a Creative Commons Attribution 4.0 International licence (CC BY 4.0). https://creativecommons.org/licenses/by/4.0/
4. Disclaimer: ECMWF does not accept any liability whatsoever for any error or omission in the data, their availability, or for any loss or damage arising from their use.
## Usage
Use the Hugging Face Hub file system to list and download files; the `datasets` library doesn't support netCDF files yet.
```python
from huggingface_hub import HfFileSystem, hf_hub_download
import xarray

fs = HfFileSystem()

# List the netCDF files available in the dataset/ folder of this repo.
files = [
    file.rsplit("/", 1)[1]
    for file in fs.ls("datasets/shermansiu/dm_graphcast_datasets/dataset", detail=False)
]

# Download one file to the local cache and load it with xarray.
local_file: str = hf_hub_download(
    repo_id="shermansiu/dm_graphcast_datasets",
    filename=f"dataset/{files[0]}",
    repo_type="dataset",
)
with open(local_file, "rb") as f:
    example_batch = xarray.load_dataset(f).compute()
```
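The config names above follow a regular pattern, so the repo path passed to `hf_hub_download` can also be built programmatically. The helper below is a small sketch (not part of any library) that assumes the naming scheme shown in this card:

```python
def graphcast_dataset_filename(source: str, date: str, res: str, levels: int, steps: int) -> str:
    """Build the repo path of a dataset file from its parameters.

    Assumes the naming scheme used in this repo, e.g.
    dataset/source-era5_date-2022-01-01_res-0.25_levels-13_steps-01.nc
    """
    return f"dataset/source-{source}_date-{date}_res-{res}_levels-{levels}_steps-{steps:02d}.nc"

# Example: the 13-level, 0.25-degree, single-step ERA5 file.
print(graphcast_dataset_filename("era5", "2022-01-01", "0.25", 13, 1))
# → dataset/source-era5_date-2022-01-01_res-0.25_levels-13_steps-01.nc
```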
## Citation
- Paper: https://www.science.org/doi/10.1126/science.adi2336
- Preprint: https://arxiv.org/abs/2212.12794
```
@article{doi:10.1126/science.adi2336,
author = {Remi Lam and Alvaro Sanchez-Gonzalez and Matthew Willson and Peter Wirnsberger and Meire Fortunato and Ferran Alet and Suman Ravuri and Timo Ewalds and Zach Eaton-Rosen and Weihua Hu and Alexander Merose and Stephan Hoyer and George Holland and Oriol Vinyals and Jacklynn Stott and Alexander Pritzel and Shakir Mohamed and Peter Battaglia },
title = {Learning skillful medium-range global weather forecasting},
journal = {Science},
volume = {382},
number = {6677},
pages = {1416-1421},
year = {2023},
doi = {10.1126/science.adi2336},
URL = {https://www.science.org/doi/abs/10.1126/science.adi2336},
eprint = {https://www.science.org/doi/pdf/10.1126/science.adi2336},
abstract = {Global medium-range weather forecasting is critical to decision-making across many social and economic domains. Traditional numerical weather prediction uses increased compute resources to improve forecast accuracy but does not directly use historical weather data to improve the underlying model. Here, we introduce GraphCast, a machine learning–based method trained directly from reanalysis data. It predicts hundreds of weather variables for the next 10 days at 0.25° resolution globally in under 1 minute. GraphCast significantly outperforms the most accurate operational deterministic systems on 90\% of 1380 verification targets, and its forecasts support better severe event prediction, including tropical cyclone tracking, atmospheric rivers, and extreme temperatures. GraphCast is a key advance in accurate and efficient weather forecasting and helps realize the promise of machine learning for modeling complex dynamical systems. The numerical models used to predict weather are large, complex, and computationally demanding and do not learn from past weather patterns. Lam et al. introduced a machine learning–based method that has been trained directly from reanalysis data of past atmospheric conditions. In this way, the authors were able to quickly predict hundreds of weather variables globally up to 10 days in advance and at high resolution. Their predictions were more accurate than those of traditional weather models in 90\% of tested cases and displayed better severe event prediction for tropical cyclones, atmospheric rivers, and extreme temperatures. —H. Jesse Smith Machine learning leads to better, faster, and cheaper weather forecasting.}}
```
|
CyberHarem/sangonomiya_kokomi_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sangonomiya_kokomi/珊瑚宮心海/珊瑚宫心海 (Genshin Impact)
This is the dataset of sangonomiya_kokomi/珊瑚宮心海/珊瑚宫心海 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `long_hair, pink_hair, multicolored_hair, bow-shaped_hair, purple_eyes, bow, gradient_hair, blunt_bangs, very_long_hair, blue_hair, breasts, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.35 GiB | [Download](https://huggingface.co/datasets/CyberHarem/sangonomiya_kokomi_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 1.08 GiB | [Download](https://huggingface.co/datasets/CyberHarem/sangonomiya_kokomi_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1395 | 2.10 GiB | [Download](https://huggingface.co/datasets/CyberHarem/sangonomiya_kokomi_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sangonomiya_kokomi_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
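The `IMG+TXT` packages pair each image with a same-stem `.txt` file holding its tags, so they can also be read without waifuc. The helper below is a minimal sketch of that pairing (not part of waifuc), assuming the extracted directory layout described above:

```python
import os

# Common image extensions used in these packages (assumption).
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def pair_img_txt(dataset_dir: str):
    """Return (image_path, tags) pairs from an extracted IMG+TXT dataset."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if os.path.exists(txt_path):
            with open(txt_path, "r", encoding="utf-8") as f:
                pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs
```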
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, bare_shoulders, fish, frilled_sleeves, navel, short_shorts, solo, underwater, white_gloves, white_thighhighs, wide_sleeves, vision_(genshin_impact), detached_collar, white_shorts, air_bubble, half_gloves, long_sleeves, looking_at_viewer, smile, detached_sleeves, blush, closed_mouth |
| 1 | 15 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, pleated_skirt, solo, sailor_collar, serafuku, white_shirt, sidelocks, colored_tips, blue_skirt, blush, bowtie, short_sleeves, closed_mouth, smile, black_skirt, blue_bow, long_sleeves, neckerchief, ponytail, sitting, white_thighhighs |
| 2 | 5 |  |  |  |  |  | 1girl, enmaided, frills, looking_at_viewer, solo, maid_apron, maid_headdress, puffy_short_sleeves, smile, black_dress, blush, white_apron, black_footwear, blue_eyes, full_body, hair_bow, half_gloves, holding, shoes, simple_background, vision_(genshin_impact), white_background, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | fish | frilled_sleeves | navel | short_shorts | solo | underwater | white_gloves | white_thighhighs | wide_sleeves | vision_(genshin_impact) | detached_collar | white_shorts | air_bubble | half_gloves | long_sleeves | looking_at_viewer | smile | detached_sleeves | blush | closed_mouth | alternate_costume | pleated_skirt | sailor_collar | serafuku | white_shirt | sidelocks | colored_tips | blue_skirt | bowtie | short_sleeves | black_skirt | blue_bow | neckerchief | ponytail | sitting | enmaided | frills | maid_apron | maid_headdress | puffy_short_sleeves | black_dress | white_apron | black_footwear | blue_eyes | full_body | hair_bow | holding | shoes | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:------------------|:--------|:---------------|:-------|:-------------|:---------------|:-------------------|:---------------|:--------------------------|:------------------|:---------------|:-------------|:--------------|:---------------|:--------------------|:--------|:-------------------|:--------|:---------------|:--------------------|:----------------|:----------------|:-----------|:--------------|:------------|:---------------|:-------------|:---------|:----------------|:--------------|:-----------|:--------------|:-----------|:----------|:-----------|:---------|:-------------|:-----------------|:----------------------|:--------------|:--------------|:-----------------|:------------|:------------|:-----------|:----------|:--------|:--------------------|:-------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | | | | | | X | | | X | | | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | | | | X | | | X | | X | | | | X | | X | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
IslamMesabah/CoderAPI_Dataset | ---
license: mit
tags:
- code
- API
size_categories:
- n<1K
---
### Large Language Models for instructed and effective code generation using Documentation of APIs
This thesis explores the effective use of large language models, specifically the Instruct CodeT5+ 16 Billion model, for generating multi-line, ready-to-execute Python code. Departing from conventional reliance solely on pre-trained LLM knowledge, we employ API documentation to improve the correctness of generated code for APIs both seen and unseen in the LLM's knowledge. We use the Retrieval-Augmented Generation technique to incorporate user intents expressed in English, specifically targeting APIs, and select the most suitable segments from the relevant API documentation. These user intents and documentation segments are then used in prompt engineering and fine-tuning. We collect a newly synthesized dataset comprising 938 data points covering 46 distinct APIs. Our approach yields a 0.2 increase in ICE score and a 0.33% improvement in CodeBLEU. Our experimental evaluation provides insights into the complexities of code generation, including the impact of seen and unseen API documentation on model performance and the effectiveness of prompt-engineering strategies. This work underscores the importance of leveraging natural language processing techniques to address real-world challenges in software engineering, with implications for automated software development and enhanced developer productivity. |
shiertier/utils | ---
license: mit
---
|
anuragiiser/PREDEX | ---
license: mit
---
|
GEM-submissions/lewtun__this-is-a-test__1647256250 | ---
benchmark: gem
type: prediction
submission_name: This is a test
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test
|
CyberHarem/hitamu_kyan_futokunoguild | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hitamu Kyan
This is the dataset of Hitamu Kyan, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 296 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 719 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 296 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 296 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 296 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 296 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 296 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 719 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 719 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 719 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Reza-Madani/output_from_hpqa_validation | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: facts
sequence: string
splits:
- name: train
num_bytes: 4546213
num_examples: 9045
download_size: 3008407
dataset_size: 4546213
---
# Dataset Card for "output_from_hpqa_validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wizmak/athena_quiries | ---
license: c-uda
---
|