| datasetId | card |
|---|---|
HydraLM/clustered_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: embedding
sequence: float32
- name: __index_level_0__
dtype: int64
- name: cluster
sequence: int64
splits:
- name: train
num_bytes: 13588132382
num_examples: 2297193
download_size: 13051782294
dataset_size: 13588132382
---
# Dataset Card for "clustered_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-e1907042-7494832 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- clinc_oos
eval_info:
task: multi_class_classification
model: abdelkader/distilbert-base-uncased-distilled-clinc
metrics: []
dataset_name: clinc_oos
dataset_config: small
dataset_split: test
col_mapping:
text: text
target: intent
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: abdelkader/distilbert-base-uncased-distilled-clinc
* Dataset: clinc_oos
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
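The `col_mapping` block in the metadata above tells the evaluator which dataset columns feed which model inputs (evaluator field on the left, dataset column on the right, as I read it). A minimal sketch of applying such a mapping to a raw record; the example record is hypothetical, not taken from `clinc_oos`:

```python
# AutoTrain-style col_mapping: evaluator field -> dataset column,
# mirroring the YAML above (text: text, target: intent).
col_mapping = {"text": "text", "target": "intent"}

def apply_mapping(record: dict, mapping: dict) -> dict:
    """Rename dataset columns to the field names the evaluator expects."""
    return {field: record[column] for field, column in mapping.items()}

# Hypothetical clinc_oos-style record (intents are integer class ids).
raw = {"text": "what is my bank balance", "intent": 12}
mapped = apply_mapping(raw, col_mapping)
# mapped == {"text": "what is my bank balance", "target": 12}
```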
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
nizamovtimur/wikiutmn-study-gigachat | ---
license: mit
dataset_info:
features:
- name: question
dtype: string
- name: document
dtype: string
- name: human_answer
dtype: string
splits:
- name: train
num_bytes: 2087905
num_examples: 355
- name: test
num_bytes: 139365
num_examples: 67
download_size: 164688
dataset_size: 2227270
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Zhongxing0129/authorlist_test | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Austen
'1': Wilde
'2': Tolstoy
'3': Dickens
splits:
- name: train
num_bytes: 278549.13812325796
num_examples: 646
download_size: 183669
dataset_size: 278549.13812325796
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
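The `class_label` feature above stores author labels as integer ids. A small pure-Python sketch of the id-to-name mapping it implies (this mirrors, but does not use, the `datasets.ClassLabel` API):

```python
# Integer id <-> author name mapping implied by the class_label block above.
names = {0: "Austen", 1: "Wilde", 2: "Tolstoy", 3: "Dickens"}

# Reverse lookup, analogous to ClassLabel.str2int.
str2int = {name: idx for idx, name in names.items()}

label_id = str2int["Tolstoy"]   # integer id stored in the `label` column
author = names[label_id]        # back to the human-readable name
```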
|
open-llm-leaderboard/details_bongchoi__test-llama2-70b | ---
pretty_name: Evaluation run of bongchoi/test-llama2-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bongchoi/test-llama2-70b](https://huggingface.co/bongchoi/test-llama2-70b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bongchoi__test-llama2-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T14:13:10.692338](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama2-70b/blob/main/results_2023-10-04T14-13-10.692338.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6967225637378714,\n\
\ \"acc_stderr\": 0.030867069907791145,\n \"acc_norm\": 0.7008615431872544,\n\
\ \"acc_norm_stderr\": 0.030836865817034945,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559696,\n \"mc2\": 0.44923493721887353,\n\
\ \"mc2_stderr\": 0.01390226410719232\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759091,\n\
\ \"acc_norm\": 0.6732081911262798,\n \"acc_norm_stderr\": 0.013706665975587333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6760605457080263,\n\
\ \"acc_stderr\": 0.00467020812857923,\n \"acc_norm\": 0.8733320055765784,\n\
\ \"acc_norm_stderr\": 0.0033192094001351187\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354544,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354544\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451207,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451207\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.02188617856717253,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.02188617856717253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983134,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983134\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7410256410256411,\n \"acc_stderr\": 0.02221110681006167,\n \
\ \"acc_norm\": 0.7410256410256411,\n \"acc_norm_stderr\": 0.02221110681006167\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8733944954128441,\n \"acc_stderr\": 0.014257128686165169,\n \"\
acc_norm\": 0.8733944954128441,\n \"acc_norm_stderr\": 0.014257128686165169\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6342592592592593,\n \"acc_stderr\": 0.032847388576472056,\n \"\
acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.032847388576472056\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080437,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342344,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342344\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n\
\ \"acc_stderr\": 0.012331009307795656,\n \"acc_norm\": 0.8620689655172413,\n\
\ \"acc_norm_stderr\": 0.012331009307795656\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n\
\ \"acc_stderr\": 0.016653875777524012,\n \"acc_norm\": 0.4547486033519553,\n\
\ \"acc_norm_stderr\": 0.016653875777524012\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n\
\ \"acc_stderr\": 0.023222756797435115,\n \"acc_norm\": 0.7877813504823151,\n\
\ \"acc_norm_stderr\": 0.023222756797435115\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257114,\n\
\ \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5673758865248227,\n \"acc_stderr\": 0.02955545423677884,\n \
\ \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.02955545423677884\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5319426336375489,\n\
\ \"acc_stderr\": 0.012744149704869645,\n \"acc_norm\": 0.5319426336375489,\n\
\ \"acc_norm_stderr\": 0.012744149704869645\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.7565359477124183,\n \"acc_stderr\": 0.01736247376214662,\n \"\
acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.01736247376214662\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.0259911176728133,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.0259911176728133\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559696,\n \"mc2\": 0.44923493721887353,\n\
\ \"mc2_stderr\": 0.01390226410719232\n }\n}\n```"
repo_url: https://huggingface.co/bongchoi/test-llama2-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|arc:challenge|25_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hellaswag|10_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T14-13-10.692338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T14-13-10.692338.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T14-13-10.692338.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T14-13-10.692338.parquet'
- config_name: results
data_files:
- split: 2023_10_04T14_13_10.692338
path:
- results_2023-10-04T14-13-10.692338.parquet
- split: latest
path:
- results_2023-10-04T14-13-10.692338.parquet
---
# Dataset Card for Evaluation run of bongchoi/test-llama2-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bongchoi/test-llama2-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bongchoi/test-llama2-70b](https://huggingface.co/bongchoi/test-llama2-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bongchoi__test-llama2-70b",
"harness_truthfulqa_mc_0",
split="train")
```
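The per-task configuration names listed above follow a regular pattern (`harness_hendrycksTest_<subject>_5` for the 5-shot MMLU subtasks), so the name passed as the second argument to `load_dataset` can be built programmatically. A minimal sketch; the helper name is illustrative, not part of this card:

```python
# Hypothetical helper: builds the configuration name for a 5-shot MMLU subtask,
# matching the `harness_hendrycksTest_<subject>_5` pattern used in this card.
def mmlu_config_name(subject: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{subject}_{n_shot}"

print(mmlu_config_name("college_physics"))  # harness_hendrycksTest_college_physics_5
```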
## Latest results
These are the [latest results from run 2023-10-04T14:13:10.692338](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama2-70b/blob/main/results_2023-10-04T14-13-10.692338.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6967225637378714,
"acc_stderr": 0.030867069907791145,
"acc_norm": 0.7008615431872544,
"acc_norm_stderr": 0.030836865817034945,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559696,
"mc2": 0.44923493721887353,
"mc2_stderr": 0.01390226410719232
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759091,
"acc_norm": 0.6732081911262798,
"acc_norm_stderr": 0.013706665975587333
},
"harness|hellaswag|10": {
"acc": 0.6760605457080263,
"acc_stderr": 0.00467020812857923,
"acc_norm": 0.8733320055765784,
"acc_norm_stderr": 0.0033192094001351187
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354544,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354544
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717253,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717253
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983134,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983134
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7410256410256411,
"acc_stderr": 0.02221110681006167,
"acc_norm": 0.7410256410256411,
"acc_norm_stderr": 0.02221110681006167
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857403,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857403
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8733944954128441,
"acc_stderr": 0.014257128686165169,
"acc_norm": 0.8733944954128441,
"acc_norm_stderr": 0.014257128686165169
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.032847388576472056,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.032847388576472056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080437,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342344,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342344
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795656,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524012,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524012
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435115,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435115
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257114,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.02955545423677884,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.02955545423677884
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5319426336375489,
"acc_stderr": 0.012744149704869645,
"acc_norm": 0.5319426336375489,
"acc_norm_stderr": 0.012744149704869645
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.01736247376214662,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.01736247376214662
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.0259911176728133,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.0259911176728133
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559696,
"mc2": 0.44923493721887353,
"mc2_stderr": 0.01390226410719232
}
}
```
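The per-task numbers above can be aggregated with plain Python once the results JSON is loaded into a dict. A minimal sketch, with only a few of the entries shown above hard-coded for brevity:

```python
# Average the `acc` metric over the MMLU ("hendrycksTest") entries; only a
# small subset of the results shown above is hard-coded here for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8092105263157895},
    "harness|truthfulqa:mc|0": {"mc1": 0.3108935128518972},  # filtered out: no `acc` key
}

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"mean acc over {len(mmlu)} MMLU tasks: {mean_acc:.4f}")
```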
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Anon3365/bias-test-gpt-biases | ---
license: apache-2.0
---
|
dhuynh95/Magicoder-Evol-Instruct-1000-Filtered_0.5-Special-Token | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2107855
num_examples: 1000
download_size: 1112443
dataset_size: 2107855
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_MSL7__INEX8-7B | ---
pretty_name: Evaluation run of MSL7/INEX8-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MSL7/INEX8-7B](https://huggingface.co/MSL7/INEX8-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MSL7__INEX8-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T01:25:47.449935](https://huggingface.co/datasets/open-llm-leaderboard/details_MSL7__INEX8-7B/blob/main/results_2024-03-03T01-25-47.449935.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511616938750305,\n\
\ \"acc_stderr\": 0.03206404748985042,\n \"acc_norm\": 0.6504299835030152,\n\
\ \"acc_norm_stderr\": 0.032736099716108295,\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.016909693580248835,\n \"mc2\": 0.7782906713919198,\n\
\ \"mc2_stderr\": 0.013752478952084576\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n\
\ \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.01292893319649636\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n\
\ \"acc_stderr\": 0.0044958914405194205,\n \"acc_norm\": 0.8918542123083051,\n\
\ \"acc_norm_stderr\": 0.003099297418323546\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523369,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523369\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.016909693580248835,\n \"mc2\": 0.7782906713919198,\n\
\ \"mc2_stderr\": 0.013752478952084576\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571757\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \
\ \"acc_stderr\": 0.012740305717376268\n }\n}\n```"
repo_url: https://huggingface.co/MSL7/INEX8-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|arc:challenge|25_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|gsm8k|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hellaswag|10_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T01-25-47.449935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T01-25-47.449935.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- '**/details_harness|winogrande|5_2024-03-03T01-25-47.449935.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T01-25-47.449935.parquet'
- config_name: results
data_files:
- split: 2024_03_03T01_25_47.449935
path:
- results_2024-03-03T01-25-47.449935.parquet
- split: latest
path:
- results_2024-03-03T01-25-47.449935.parquet
---
# Dataset Card for Evaluation run of MSL7/INEX8-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MSL7/INEX8-7B](https://huggingface.co/MSL7/INEX8-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MSL7__INEX8-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-03T01:25:47.449935](https://huggingface.co/datasets/open-llm-leaderboard/details_MSL7__INEX8-7B/blob/main/results_2024-03-03T01-25-47.449935.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results for each eval in its "latest" split):
```python
{
"all": {
"acc": 0.6511616938750305,
"acc_stderr": 0.03206404748985042,
"acc_norm": 0.6504299835030152,
"acc_norm_stderr": 0.032736099716108295,
"mc1": 0.6291309669522643,
"mc1_stderr": 0.016909693580248835,
"mc2": 0.7782906713919198,
"mc2_stderr": 0.013752478952084576
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.7329351535836177,
"acc_norm_stderr": 0.01292893319649636
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.0044958914405194205,
"acc_norm": 0.8918542123083051,
"acc_norm_stderr": 0.003099297418323546
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523369,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523369
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6291309669522643,
"mc1_stderr": 0.016909693580248835,
"mc2": 0.7782906713919198,
"mc2_stderr": 0.013752478952084576
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571757
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.012740305717376268
}
}
```
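The `"all"` entry above aggregates the per-task scores. As a minimal sketch (not the leaderboard's actual aggregation code), the macro-average of the per-task accuracies can be computed like this — the three values below are copied from the results above purely for illustration, not the full task list:

```python
# Illustrative subset of per-task accuracies from the results JSON above.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.31,
    "harness|hendrycksTest-anatomy|5": 0.6370370370370371,
    "harness|hendrycksTest-astronomy|5": 0.7105263157894737,
}

# Macro-average: unweighted mean of the per-task accuracies.
macro_avg = sum(task_acc.values()) / len(task_acc)
print(round(macro_avg, 4))
```

Run over the full set of task entries in the JSON, this kind of unweighted mean is what the aggregate `acc` field summarizes.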
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Eldog333/Me | ---
license: other
---
|
eswanYS/hgtest | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
stevhliu/demo | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- summarization
- text2text-generation
task_ids: []
tags:
- conditional-text-generation
---
# Dataset Card for Demo
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is a demo dataset with two files `train.csv` and `test.csv`.
Load it by:
```python
from datasets import load_dataset
data_files = {"train": "train.csv", "test": "test.csv"}
demo = load_dataset("stevhliu/demo", data_files=data_files)
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-7b7f8a-16126218 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: validation
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: launch/gov_report
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
Ashish08/vada-sambhar | ---
license: apache-2.0
language:
- en
pretty_name: vada sambhar
size_categories:
- n<1K
source_datasets:
- google
tags:
- images
- food
- vada sambhar
- dreambooth-hackathon
---
# Dataset Card for vada-sambhar
## Dataset Description
The dataset consists of images of my favorite South Indian dish, Vada Sambhar.
### Dataset Curators
The data has been downloaded from Google Images.
### Licensing Information
The vada-sambhar dataset version 1.0.0 is released under the Apache-2.0 License. |
mwitiderrick/lamini_mistral | ---
license: apache-2.0
language:
- en
---
# Lamini Mistral
The Lamini Docs dataset formatted for fine-tuning with the Mistral-7B Instruct model.
gayanin/woz-noised-with-prob-dist-v2 | ---
dataset_info:
- config_name: babylon-prob-01
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2630935
num_examples: 20304
- name: test
num_bytes: 326013
num_examples: 2538
- name: validation
num_bytes: 328959
num_examples: 2539
download_size: 1730992
dataset_size: 3285907
- config_name: babylon-prob-02
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2587653
num_examples: 20304
- name: test
num_bytes: 319619
num_examples: 2538
- name: validation
num_bytes: 323916
num_examples: 2539
download_size: 1773744
dataset_size: 3231188
- config_name: babylon-prob-03
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2543333
num_examples: 20304
- name: test
num_bytes: 314849
num_examples: 2538
- name: validation
num_bytes: 318386
num_examples: 2539
download_size: 1803480
dataset_size: 3176568
- config_name: babylon-prob-04
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2498217
num_examples: 20304
- name: test
num_bytes: 310289
num_examples: 2538
- name: validation
num_bytes: 314195
num_examples: 2539
download_size: 1826199
dataset_size: 3122701
- config_name: babylon-prob-05
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2457853
num_examples: 20304
- name: test
num_bytes: 306097
num_examples: 2538
- name: validation
num_bytes: 307569
num_examples: 2539
download_size: 1844172
dataset_size: 3071519
- config_name: gcd-prob-01
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2580319
num_examples: 20304
- name: test
num_bytes: 326137
num_examples: 2538
- name: validation
num_bytes: 314447
num_examples: 2539
download_size: 1672612
dataset_size: 3220903
- config_name: gcd-prob-02
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2488852
num_examples: 20304
- name: test
num_bytes: 314869
num_examples: 2538
- name: validation
num_bytes: 302499
num_examples: 2539
download_size: 1659272
dataset_size: 3106220
- config_name: gcd-prob-03
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2397420
num_examples: 20304
- name: test
num_bytes: 303076
num_examples: 2538
- name: validation
num_bytes: 291223
num_examples: 2539
download_size: 1637199
dataset_size: 2991719
- config_name: gcd-prob-04
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2306973
num_examples: 20304
- name: test
num_bytes: 291188
num_examples: 2538
- name: validation
num_bytes: 280562
num_examples: 2539
download_size: 1608211
dataset_size: 2878723
- config_name: gcd-prob-05
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2217222
num_examples: 20304
- name: test
num_bytes: 279583
num_examples: 2538
- name: validation
num_bytes: 271343
num_examples: 2539
download_size: 1574265
dataset_size: 2768148
- config_name: kaggle-prob-01
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2575089
num_examples: 20304
- name: test
num_bytes: 318605
num_examples: 2538
- name: validation
num_bytes: 322727
num_examples: 2538
download_size: 1666313
dataset_size: 3216421
- config_name: kaggle-prob-02
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2490036
num_examples: 20304
- name: test
num_bytes: 308433
num_examples: 2538
- name: validation
num_bytes: 310492
num_examples: 2538
download_size: 1660616
dataset_size: 3108961
- config_name: kaggle-prob-03
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2404198
num_examples: 20304
- name: test
num_bytes: 297671
num_examples: 2538
- name: validation
num_bytes: 300859
num_examples: 2538
download_size: 1643086
dataset_size: 3002728
- config_name: kaggle-prob-04
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2316877
num_examples: 20304
- name: test
num_bytes: 286791
num_examples: 2538
- name: validation
num_bytes: 289662
num_examples: 2538
download_size: 1618093
dataset_size: 2893330
- config_name: kaggle-prob-05
features:
- name: 'Unnamed: 0'
dtype: int64
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 2231108
num_examples: 20304
- name: test
num_bytes: 276030
num_examples: 2538
- name: validation
num_bytes: 279656
num_examples: 2538
download_size: 1589492
dataset_size: 2786794
configs:
- config_name: babylon-prob-01
data_files:
- split: train
path: babylon-prob-01/train-*
- split: test
path: babylon-prob-01/test-*
- split: validation
path: babylon-prob-01/validation-*
- config_name: babylon-prob-02
data_files:
- split: train
path: babylon-prob-02/train-*
- split: test
path: babylon-prob-02/test-*
- split: validation
path: babylon-prob-02/validation-*
- config_name: babylon-prob-03
data_files:
- split: train
path: babylon-prob-03/train-*
- split: test
path: babylon-prob-03/test-*
- split: validation
path: babylon-prob-03/validation-*
- config_name: babylon-prob-04
data_files:
- split: train
path: babylon-prob-04/train-*
- split: test
path: babylon-prob-04/test-*
- split: validation
path: babylon-prob-04/validation-*
- config_name: babylon-prob-05
data_files:
- split: train
path: babylon-prob-05/train-*
- split: test
path: babylon-prob-05/test-*
- split: validation
path: babylon-prob-05/validation-*
- config_name: gcd-prob-01
data_files:
- split: train
path: gcd-prob-01/train-*
- split: test
path: gcd-prob-01/test-*
- split: validation
path: gcd-prob-01/validation-*
- config_name: gcd-prob-02
data_files:
- split: train
path: gcd-prob-02/train-*
- split: test
path: gcd-prob-02/test-*
- split: validation
path: gcd-prob-02/validation-*
- config_name: gcd-prob-03
data_files:
- split: train
path: gcd-prob-03/train-*
- split: test
path: gcd-prob-03/test-*
- split: validation
path: gcd-prob-03/validation-*
- config_name: gcd-prob-04
data_files:
- split: train
path: gcd-prob-04/train-*
- split: test
path: gcd-prob-04/test-*
- split: validation
path: gcd-prob-04/validation-*
- config_name: gcd-prob-05
data_files:
- split: train
path: gcd-prob-05/train-*
- split: test
path: gcd-prob-05/test-*
- split: validation
path: gcd-prob-05/validation-*
- config_name: kaggle-prob-01
data_files:
- split: train
path: kaggle-prob-01/train-*
- split: test
path: kaggle-prob-01/test-*
- split: validation
path: kaggle-prob-01/validation-*
- config_name: kaggle-prob-02
data_files:
- split: train
path: kaggle-prob-02/train-*
- split: test
path: kaggle-prob-02/test-*
- split: validation
path: kaggle-prob-02/validation-*
- config_name: kaggle-prob-03
data_files:
- split: train
path: kaggle-prob-03/train-*
- split: test
path: kaggle-prob-03/test-*
- split: validation
path: kaggle-prob-03/validation-*
- config_name: kaggle-prob-04
data_files:
- split: train
path: kaggle-prob-04/train-*
- split: test
path: kaggle-prob-04/test-*
- split: validation
path: kaggle-prob-04/validation-*
- config_name: kaggle-prob-05
data_files:
- split: train
path: kaggle-prob-05/train-*
- split: test
path: kaggle-prob-05/test-*
- split: validation
path: kaggle-prob-05/validation-*
---
|
daekeun-ml/mobile-icons-poc | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 19041285.0
num_examples: 140
download_size: 19031365
dataset_size: 19041285.0
---
# Dataset Card for "mobile-icons-poc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HeetShah/dataset | ---
license: mit
---
|
R-J/SPI-2M | ---
license: apache-2.0
task_categories:
- image-to-image
tags:
- medical
size_categories:
- 1M<n<10M
---
# SPI-2M
We introduce Stylized Pathology Images **SPI-2M** for stain normalisation via neural style transfer in histopathology.
For full details on dataset sourcing, creation, etc., please see our [paper](https://arxiv.org/abs/2403.09302).
### Dataset download
The `data` directory of this repository is organised as follows:
- `sources`: the 4096 curated source images, zipped together
- `targets`: the 512 target images, zipped together
- `stylized`: 512 `.npy` files, each sharing the index of a corresponding target image (e.g. t0001) and containing all 4096 source images transformed using that target image as the style, i.e. each file is a `4096x1024x1024x3` array
Each image is 1024x1024x3, extracted at 40x magnification.
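Because each stylized `.npy` file holds all 4096 transformed source images (roughly 12 GB uncompressed, assuming `uint8` pixels), loading a whole file into RAM is often impractical. A minimal sketch of slicing out a single image via NumPy memory-mapping (the function name and demo file are illustrative, not part of the dataset):

```python
import numpy as np

def load_stylized_image(npy_path, source_index):
    """Memory-map a stylized .npy file and return one transformed
    source image without reading the full array into RAM."""
    # mmap_mode="r" keeps the large array on disk; only the
    # requested slice is paged into memory.
    arr = np.load(npy_path, mmap_mode="r")
    return np.array(arr[source_index])  # copy out the single HxWx3 image

# Demo with a small stand-in array (real files are 4096x1024x1024x3):
demo = np.zeros((4, 8, 8, 3), dtype=np.uint8)
demo[2] = 7
np.save("demo_stylized.npy", demo)
img = load_stylized_image("demo_stylized.npy", 2)
```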
To download, we recommend using the Hugging Face Hub API with [hf_transfer](https://huggingface.co/docs/huggingface_hub/v0.21.4/package_reference/environment_variables#hfhubenablehftransfer) enabled.
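The recommendation above can be sketched as follows. This is a minimal example, not a definitive script: it assumes `huggingface_hub` and `hf_transfer` are installed (`pip install huggingface_hub hf_transfer`), and uses `allow_patterns` so you can fetch only the folders you need (e.g. skip the very large `stylized` files):

```python
import os

# Opt in to the Rust-based transfer backend before huggingface_hub is imported.
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "1"

def download_spi2m(local_dir="SPI-2M", patterns=("targets/*",)):
    """Download part of the R-J/SPI-2M dataset repository.

    `patterns` restricts which files are fetched; pass None-like
    broad patterns such as ("*",) to mirror the whole repo.
    """
    from huggingface_hub import snapshot_download
    return snapshot_download(
        repo_id="R-J/SPI-2M",
        repo_type="dataset",
        local_dir=local_dir,
        allow_patterns=list(patterns),
    )
```

Calling `download_spi2m()` then mirrors just the `targets` folder into `./SPI-2M`.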
### Citation
If you use the data from this repository, please cite our paper:
```
@misc{jewsbury2024stainfuser,
title={StainFuser: Controlling Diffusion for Faster Neural Style Transfer in Multi-Gigapixel Histology Images},
author={Robert Jewsbury and Ruoyu Wang and Abhir Bhalerao and Nasir Rajpoot and Quoc Dang Vu},
year={2024},
eprint={2403.09302},
archivePrefix={arXiv},
primaryClass={eess.IV}
}
```
|
open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat | ---
pretty_name: Evaluation run of wahaha1987/llama_13b_sharegpt94k_fastchat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wahaha1987/llama_13b_sharegpt94k_fastchat](https://huggingface.co/wahaha1987/llama_13b_sharegpt94k_fastchat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T01:15:25.210552](https://huggingface.co/datasets/open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat/blob/main/results_2023-10-13T01-15-25.210552.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07109899328859061,\n\
\ \"em_stderr\": 0.0026318194599633114,\n \"f1\": 0.13432151845637572,\n\
\ \"f1_stderr\": 0.0028813877533664808,\n \"acc\": 0.40513968332422795,\n\
\ \"acc_stderr\": 0.010090158389611751\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.07109899328859061,\n \"em_stderr\": 0.0026318194599633114,\n\
\ \"f1\": 0.13432151845637572,\n \"f1_stderr\": 0.0028813877533664808\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0841546626231994,\n \
\ \"acc_stderr\": 0.007647024046603203\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620297\n\
\ }\n}\n```"
repo_url: https://huggingface.co/wahaha1987/llama_13b_sharegpt94k_fastchat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T01_15_25.210552
path:
- '**/details_harness|drop|3_2023-10-13T01-15-25.210552.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T01-15-25.210552.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T01_15_25.210552
path:
- '**/details_harness|gsm8k|5_2023-10-13T01-15-25.210552.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T01-15-25.210552.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:35:52.707765.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:35:52.707765.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:35:52.707765.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T01_15_25.210552
path:
- '**/details_harness|winogrande|5_2023-10-13T01-15-25.210552.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T01-15-25.210552.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_35_52.707765
path:
- results_2023-07-19T18:35:52.707765.parquet
- split: 2023_10_13T01_15_25.210552
path:
- results_2023-10-13T01-15-25.210552.parquet
- split: latest
path:
- results_2023-10-13T01-15-25.210552.parquet
---
# Dataset Card for Evaluation run of wahaha1987/llama_13b_sharegpt94k_fastchat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wahaha1987/llama_13b_sharegpt94k_fastchat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wahaha1987/llama_13b_sharegpt94k_fastchat](https://huggingface.co/wahaha1987/llama_13b_sharegpt94k_fastchat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T01:15:25.210552](https://huggingface.co/datasets/open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat/blob/main/results_2023-10-13T01-15-25.210552.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.07109899328859061,
"em_stderr": 0.0026318194599633114,
"f1": 0.13432151845637572,
"f1_stderr": 0.0028813877533664808,
"acc": 0.40513968332422795,
"acc_stderr": 0.010090158389611751
},
"harness|drop|3": {
"em": 0.07109899328859061,
"em_stderr": 0.0026318194599633114,
"f1": 0.13432151845637572,
"f1_stderr": 0.0028813877533664808
},
"harness|gsm8k|5": {
"acc": 0.0841546626231994,
"acc_stderr": 0.007647024046603203
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.012533292732620297
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NetherlandsForensicInstitute/vuurwerkverkenner-data | ---
license: eupl-1.1
language:
- nl
---
# Vuurwerkverkenner
This dataset is used by the Vuurwerkverkenner, an application for linking snippets of exploded (heavy) fireworks to the type of firework that they originate from.
You may find the application at [www.vuurwerkverkenner.nl](https://www.vuurwerkverkenner.nl).
The dataset consists of different types of fireworks that were investigated in casework by the Netherlands Forensic Institute.
## Categories
If firework wrappers are very similar in their visual appearance, they may be grouped into categories.
In most cases, a wrapper is not similar to any other wrapper, and a category simply consists of one unique wrapper.
## Contents
The dataset consists of 185 categories, containing 332 unique wrappers in total. The dataset is organized as follows:
```
vuurwerkverkenner-data
└───fireworks_0
└─── wrappers
└─── 0
└─── wrapper.jpg
└─── compleet exemplaar.jpg
└─── gedemonteerd.jpg
└─── 1
└─── wrapper.jpg
└─── compleet exemplaar.jpg
└─── gedemonteerd.jpg
└───fireworks_1
└─── wrappers
└─── 0
└─── wrapper.jpg
└───meta.json.gz
```
The first level of folders (`fireworks_0`, `fireworks_1`, ...) corresponds to the category.
The third level (`0`, `1`, ...) corresponds to the wrappers within the category.
For each wrapper, one or multiple images may be present.
In each case, `wrapper.jpg` should be present, which is a scan of the full wrapper.
Additionally, other images may be present as well, such as the entire fireworks article or a schematic drawing of its contents.
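A minimal sketch of walking this folder layout might look as follows. A tiny stand-in tree is created first so the example runs anywhere; in practice `root` would point at the downloaded `vuurwerkverkenner-data` directory.

```python
import tempfile
from pathlib import Path

# Build a tiny stand-in tree mirroring the layout above, so the
# example is self-contained; real data has many more categories.
root = Path(tempfile.mkdtemp()) / "vuurwerkverkenner-data"
(root / "fireworks_0" / "wrappers" / "0").mkdir(parents=True)
(root / "fireworks_0" / "wrappers" / "0" / "wrapper.jpg").touch()

# Iterate categories, then the wrapper folders inside each category.
for category_dir in sorted(root.glob("fireworks_*")):
    for wrapper_dir in sorted((category_dir / "wrappers").iterdir()):
        images = sorted(p.name for p in wrapper_dir.glob("*.jpg"))
        print(category_dir.name, wrapper_dir.name, images)
```

Only `wrapper.jpg` is guaranteed per wrapper folder; the glob picks up any additional photos that happen to be present.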
In `meta.json.gz`, we store both *metadata* and *reference embeddings* for the firework categories. It has the following structure:
```
{
0: {
"embeddings": [[...], [...], ...],
"wrappers": [
0: {
"wrapper_text": "abcdef",
"article_name": "abc",
...
},
1: {...},
2: {...}
]
},
1: {...}
}
```
The first level (`meta[0]`, `meta[1]`, `meta[2]` ...) indicates the fireworks category, which corresponds to the first level of the photo folders (i.e. `meta[0]` matches the folder `fireworks_0`).
Each category contains the reference embeddings and the metadata for each wrapper.
The third level, e.g. `meta[0]['wrappers'][0]`, indicates the wrapper within the category and corresponds to the photo folder of that wrapper (e.g. `fireworks_0/wrappers/0`).
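A minimal sketch of loading the metadata file could look like this. A tiny stand-in `meta.json.gz` is written first so the example is self-contained; the field values are invented for illustration, and note that JSON object keys come back as strings.

```python
import gzip
import json
import os
import tempfile

def load_meta(path):
    # meta.json.gz is gzip-compressed JSON; open in text mode and parse.
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return json.load(f)

# Stand-in structure mirroring the description above
# (category -> embeddings + wrappers with metadata fields).
meta = {
    "0": {
        "embeddings": [[0.1, 0.2], [0.3, 0.4]],
        "wrappers": {"0": {"wrapper_text": "abcdef", "article_name": "abc"}},
    }
}
path = os.path.join(tempfile.mkdtemp(), "meta.json.gz")
with gzip.open(path, "wt", encoding="utf-8") as f:
    json.dump(meta, f)

loaded = load_meta(path)
print(loaded["0"]["wrappers"]["0"]["wrapper_text"])  # abcdef
```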
### _Reference embeddings_
For each category, reference embeddings are provided.
They are constructed by an AI model that is trained to create embeddings for (photos of) firework wrappers and snippets of exploded wrappers.
When fed to this model, photos of snippets that belong to a given category should have similar embeddings to the ones for the wrappers.
For more details about the model, see [here](https://huggingface.co/NetherlandsForensicInstitute/vuurwerkverkenner).
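As a hedged sketch of how such reference embeddings could be matched: rank categories by cosine similarity between a snippet's embedding and each category's reference embeddings. The vectors below are made up for illustration; real embeddings come from the model.

```python
import math

def cosine(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_category(snippet_emb, meta):
    # Score each category by its best-matching reference embedding.
    scores = {
        category: max(cosine(snippet_emb, ref) for ref in entry["embeddings"])
        for category, entry in meta.items()
    }
    return max(scores, key=scores.get)

# Invented toy embeddings for two categories.
meta = {
    "fireworks_0": {"embeddings": [[1.0, 0.0], [0.9, 0.1]]},
    "fireworks_1": {"embeddings": [[0.0, 1.0]]},
}
print(best_category([0.8, 0.2], meta))  # fireworks_0
```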
### _Metadata_
For each wrapper, metadata is provided. The fields are described below.
| Field | Description |
|-------------------------------|-----------------------------------------------------|
| `text` | The full text that is on the wrapper (excluding non-Latin alphabets). |
| `endangerment` | The level of danger or risk associated with the firework. |
| `article_name` | The name of the item. |
| `firework_type` | The type of firework. |
| `firework_category` | The category or classification of the firework. |
| `tube_length` | The length of the tube. |
| `tube_diameter` | The diameter of the tube. |
| `composition_burst_charge` | The composition of the main charge of the firework. |
| `composition_other_mixtures` | The composition of other charges of the firework. |
| `mass_burst_charge` | The mass of the main charge. |
| `mass_other_mixtures` | The mass of other charges. | |
liuyanchen1015/VALUE_wikitext103_drop_aux | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 292533
num_examples: 396
- name: train
num_bytes: 131089043
num_examples: 174077
- name: validation
num_bytes: 232418
num_examples: 340
download_size: 78562593
dataset_size: 131613994
---
# Dataset Card for "VALUE_wikitext103_drop_aux"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sivan22/sefaria-hebrew | ---
dataset_info:
features:
- name: language
dtype: string
- name: title
dtype: string
- name: versionSource
dtype: string
- name: versionTitle
dtype: string
- name: status
dtype: string
- name: license
dtype: string
- name: versionTitleInHebrew
dtype: string
- name: actualLanguage
dtype: string
- name: isBaseText
dtype: bool
- name: level_1_index
dtype: float64
- name: level_2_index
dtype: float64
- name: level_3_index
dtype: float64
- name: level_4_index
dtype: float64
- name: level_5_index
dtype: float64
- name: text
dtype: string
- name: versionNotes
dtype: string
- name: versionNotesInHebrew
dtype: string
- name: method
dtype: string
- name: digitizedBySefaria
dtype: float64
- name: heversionSource
dtype: string
- name: priority
dtype: float64
- name: shortVersionTitle
dtype: string
- name: purchaseInformationImage
dtype: string
- name: purchaseInformationURL
dtype: string
splits:
- name: train
num_bytes: 1901352817
num_examples: 1955969
download_size: 544170227
dataset_size: 1901352817
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sefaria-hebrew"
This dataset contains Jewish texts in Hebrew from the Sefaria project.
Anand8078/esg_collection_5 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 9967
num_examples: 114
download_size: 5645
dataset_size: 9967
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dllllb/transactions-gender | ---
pretty_name: Prediction of client gender on card transactions
configs:
- config_name: transactions_data
data_files: transactions.csv.gz
- config_name: labels
data_files: gender_train.csv
task_categories:
- tabular-classification
tags:
- finance
---
https://www.kaggle.com/c/python-and-analyze-data-final-project/ |
narizhny/test | ---
dataset_info:
features:
- name: Name
dtype: string
- name: Surname
dtype: string
- name: Address
dtype: string
- name: City
dtype: string
- name: State
dtype: string
- name: Postcode
dtype: int64
splits:
- name: train
num_bytes: 413
num_examples: 6
download_size: 3258
dataset_size: 413
tags:
- bbb
- bbb
language:
- en
task_categories:
- translation
- mycategory123
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlp-waseda/JMMLU | ---
license: cc-by-nc-nd-4.0
task_categories:
- multiple-choice
- question-answering
language:
- ja
tags:
- llm
- evaluation
- Japanese
pretty_name: JMMLU
size_categories:
- 1K<n<10K
---
# JMMLU
Japanese Massive Multitask Language Understanding Benchmark
JMMLU is a four-choice question set consisting of questions translated into Japanese from a portion of MMLU ([Paper](https://arxiv.org/abs/2009.03300), [Github](https://github.com/hendrycks/test)) (translated questions) and questions based on a uniquely Japanese cultural context (Japanese questions). It is designed to assess the performance of large language models in Japanese.
For the translated questions, a maximum of 150 questions from each of the 57 MMLU tasks (subjects) were selected and first machine-translated into Japanese. Next, the translators checked the machine translations and removed questions and tasks that were difficult to translate, irrelevant, or inconsistent with Japanese culture. The remaining questions were modified to make them fluent.
The Japanese questions are based on school subjects, such as Japanese civics and history, and are manually created by Japanese teachers.
The format is the same as MMLU:
```
Question, Choice A, Choice B, Choice C, Choice D, Answer
```
[Github](https://github.com/nlp-waseda/JMMLU)
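Rows in the format above can be parsed with a standard CSV reader, as in this minimal sketch. The sample row is invented for illustration, not taken from JMMLU.

```python
import csv
import io

# One invented row in the (Question, Choice A, Choice B,
# Choice C, Choice D, Answer) format described above.
sample = "日本の首都はどこか,大阪,東京,京都,札幌,B\n"
for question, a, b, c, d, answer in csv.reader(io.StringIO(sample)):
    choices = {"A": a, "B": b, "C": c, "D": d}
    print(question, "->", choices[answer])  # 日本の首都はどこか -> 東京
```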
The JMMLU consists of 7,536 questions in the following 56 tasks (subjects).
| Japanese Task Name | English Task Name | Number |
|---|---|---:|
| 専門医学 | professional_medicine | 150 |
| 専門心理学 | professional_psychology | 150 |
| 専門会計 | professional_accounting | 150 |
| 哲学 | philosophy | 150 |
| 雑学 | miscellaneous | 150 |
| 医学遺伝学 | medical_genetics | 99 |
| 形式論理 | formal_logic | 125 |
| 先史学 | prehistory | 150 |
| 天文学 | astronomy | 148 |
| 熟語 | japanese_idiom | 150 |
| 世界宗教 | world_religions | 147 |
| 世界事実 | global_facts | 97 |
| 世界史 | world_history | 150 |
| 社会学 | sociology | 150 |
| 栄養学 | nutrition | 149 |
| 日本史 | japanese_history | 150 |
| 日本地理 | japanese_geography | 139 |
| 人間の老化 | human_aging | 150 |
| 論理学 | logical_fallacies | 150 |
| 倫理的議論 | moral_disputes | 148 |
| 臨床知識 | clinical_knowledge | 150 |
| 経営学 | management | 102 |
| 解剖学 | anatomy | 132 |
| 計量経済学 | econometrics | 113 |
| 機械学習 | machine_learning | 111 |
| 国際法 | international_law | 120 |
| 公民 | japanese_civics | 150 |
| 公共関係 | public_relations | 109 |
| 高校心理学 | high_school_psychology | 150 |
| 高校物理 | high_school_physics | 150 |
| 高校統計学 | high_school_statistics | 150 |
| 高校数学 | high_school_mathematics | 150 |
| 高校生物学 | high_school_biology | 148 |
| 高校情報科学 | high_school_computer_science | 98 |
| 高校化学 | high_school_chemistry | 149 |
| 高校地理 | high_school_geography | 150 |
| 高校ヨーロッパ史 | high_school_european_history | 150 |
| 高校ミクロ経済学 | high_school_microeconomics | 149 |
| 高校マクロ経済学 | high_school_macroeconomics | 148 |
| 概念物理学 | conceptual_physics | 150 |
| 法理学 | jurisprudence | 107 |
| 電気工学 | electrical_engineering | 144 |
| 大学医学 | college_medicine | 150 |
| 大学物理 | college_physics | 100 |
| 大学数学 | college_mathematics | 99 |
| 大学生物学 | college_biology | 143 |
| 大学化学 | college_chemistry | 99 |
| 大学コンピュータ科学 | college_computer_science | 99 |
| 初等数学 | elementary_mathematics | 150 |
| 抽象代数 | abstract_algebra | 99 |
| マーケティング | marketing | 150 |
| ビジネス倫理 | business_ethics | 86 |
| セクシュアリティ | human_sexuality | 130 |
| セキュリティ研究 | security_studies | 150 |
| コンピュータセキュリティ | computer_security | 99 |
| ウイルス学 | virology | 150 |
The copyrights for Japanese and World History belong to STEP Corporation. Commercial use other than for research and evaluation of language models is prohibited.
The copyrights for Japanese idioms, Japanese civics, and Japanese geography belong to New Style Cram School VIST. Commercial use is allowed only for research and evaluation of language models.
This work is licensed under CC BY-NC-ND 4.0
# Acknowledgment
We express our gratitude to the RIKEN for their support in the translation of MMLU. We also acknowledge the contributions from Step Corporation, who provided materials on Japanese and World History, and from New Style Cram School VIST, who supplied resources on japanese_idioms, japansese_civics, and japanese_geography. |
Vinnyyw/Anahivocais | ---
license: openrail
---
|
breno30/LocutorRezende | ---
license: openrail
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_233 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1120367300.0
num_examples: 220025
download_size: 1144944049
dataset_size: 1120367300.0
---
# Dataset Card for "chunk_233"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Unbabel/TowerBlocks-v0.1-MT-records | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: dataset
dtype: string
- name: lp
dtype: string
- name: examples
list:
- name: answer
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 56431483
num_examples: 157939
download_size: 32006492
dataset_size: 56431483
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_hellaswag_tr_conf_gpt2_worstscore_reversed | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 86986
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_gpt2_worstscore_reversed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
robertmyers/bpt-static | ---
license: gpl-3.0
---
|
open-llm-leaderboard/details_ehartford__Samantha-1.11-13b | ---
pretty_name: Evaluation run of ehartford/Samantha-1.11-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.11-13b](https://huggingface.co/ehartford/Samantha-1.11-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.11-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T03:12:41.840464](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-13b/blob/main/results_2023-10-22T03-12-41.840464.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.00044451099905589363,\n \"f1\": 0.06294358221476495,\n\
\ \"f1_stderr\": 0.001382016641730692,\n \"acc\": 0.4414417298508293,\n\
\ \"acc_stderr\": 0.010521593616180207\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589363,\n\
\ \"f1\": 0.06294358221476495,\n \"f1_stderr\": 0.001382016641730692\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \
\ \"acc_stderr\": 0.009041108602874675\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.01200207862948574\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.11-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|arc:challenge|25_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T03_12_41.840464
path:
- '**/details_harness|drop|3_2023-10-22T03-12-41.840464.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T03-12-41.840464.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T03_12_41.840464
path:
- '**/details_harness|gsm8k|5_2023-10-22T03-12-41.840464.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T03-12-41.840464.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hellaswag|10_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T08:47:37.032058.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T08_47_37.032058
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T08:47:37.032058.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T08:47:37.032058.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T03_12_41.840464
path:
- '**/details_harness|winogrande|5_2023-10-22T03-12-41.840464.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T03-12-41.840464.parquet'
- config_name: results
data_files:
- split: 2023_10_22T03_12_41.840464
path:
- results_2023-10-22T03-12-41.840464.parquet
- split: latest
path:
- results_2023-10-22T03-12-41.840464.parquet
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.11-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.11-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.11-13b](https://huggingface.co/ehartford/Samantha-1.11-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.11-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T03:12:41.840464](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-13b/blob/main/results_2023-10-22T03-12-41.840464.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589363,
"f1": 0.06294358221476495,
"f1_stderr": 0.001382016641730692,
"acc": 0.4414417298508293,
"acc_stderr": 0.010521593616180207
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589363,
"f1": 0.06294358221476495,
"f1_stderr": 0.001382016641730692
},
"harness|gsm8k|5": {
"acc": 0.12282031842304776,
"acc_stderr": 0.009041108602874675
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.01200207862948574
}
}
```
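In this run, the metrics in the `"all"` block match the unweighted mean of the per-task values over the tasks that report them (for `em` and `f1`, only `harness|drop|3` reports them, so `"all"` simply repeats those values). A minimal sketch of that check, using the `acc` numbers copied from the JSON above:

```python
# Per-task accuracy metrics copied from the latest-run JSON above.
per_task = {
    "harness|gsm8k|5": {"acc": 0.12282031842304776, "acc_stderr": 0.009041108602874675},
    "harness|winogrande|5": {"acc": 0.7600631412786109, "acc_stderr": 0.01200207862948574},
}

# Unweighted mean over the tasks that report the metric.
agg_acc = sum(t["acc"] for t in per_task.values()) / len(per_task)
agg_stderr = sum(t["acc_stderr"] for t in per_task.values()) / len(per_task)

print(agg_acc)     # matches the "all" acc: 0.4414417298508293
print(agg_stderr)  # matches the "all" acc_stderr: 0.010521593616180207
```

This is only a consistency check against the numbers shown in this card, not a description of the leaderboard's internal aggregation code.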
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of lmsys/vicuna-7b-v1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lmsys/vicuna-7b-v1.3](https://huggingface.co/lmsys/vicuna-7b-v1.3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lmsys__vicuna-7b-v1.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T23:45:21.646720](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.3/blob/main/results_2023-10-21T23-45-21.646720.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12730704697986578,\n\
\ \"em_stderr\": 0.003413474068983651,\n \"f1\": 0.17891254194630765,\n\
\ \"f1_stderr\": 0.0035073277688968674,\n \"acc\": 0.38083789051163464,\n\
\ \"acc_stderr\": 0.0095991004919272\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.12730704697986578,\n \"em_stderr\": 0.003413474068983651,\n\
\ \"f1\": 0.17891254194630765,\n \"f1_stderr\": 0.0035073277688968674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05686125852918878,\n \
\ \"acc_stderr\": 0.0063787902420996325\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754765\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lmsys/vicuna-7b-v1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T23_45_21.646720
path:
- '**/details_harness|drop|3_2023-10-21T23-45-21.646720.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T23-45-21.646720.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T23_45_21.646720
path:
- '**/details_harness|gsm8k|5_2023-10-21T23-45-21.646720.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T23-45-21.646720.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:22:02.219224.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:22:02.219224.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:22:02.219224.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T23_45_21.646720
path:
- '**/details_harness|winogrande|5_2023-10-21T23-45-21.646720.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T23-45-21.646720.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_22_02.219224
path:
- results_2023-07-19T16:22:02.219224.parquet
- split: 2023_10_21T23_45_21.646720
path:
- results_2023-10-21T23-45-21.646720.parquet
- split: latest
path:
- results_2023-10-21T23-45-21.646720.parquet
---
# Dataset Card for Evaluation run of lmsys/vicuna-7b-v1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lmsys/vicuna-7b-v1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lmsys/vicuna-7b-v1.3](https://huggingface.co/lmsys/vicuna-7b-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lmsys__vicuna-7b-v1.3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-21T23:45:21.646720](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.3/blob/main/results_2023-10-21T23-45-21.646720.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.12730704697986578,
"em_stderr": 0.003413474068983651,
"f1": 0.17891254194630765,
"f1_stderr": 0.0035073277688968674,
"acc": 0.38083789051163464,
"acc_stderr": 0.0095991004919272
},
"harness|drop|3": {
"em": 0.12730704697986578,
"em_stderr": 0.003413474068983651,
"f1": 0.17891254194630765,
"f1_stderr": 0.0035073277688968674
},
"harness|gsm8k|5": {
"acc": 0.05686125852918878,
"acc_stderr": 0.0063787902420996325
},
"harness|winogrande|5": {
"acc": 0.7048145224940805,
"acc_stderr": 0.012819410741754765
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MatsuoDochiai/Yasmin | ---
license: openrail
---
|
holyofferings/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5375
num_examples: 21
download_size: 5415
dataset_size: 5375
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alexgoodell/llm-as-clinical-calculator | ---
license: cc-by-nc-sa-4.0
task_categories:
- question-answering
language:
- en
tags:
- anesthesiology
- medical
pretty_name: Large Language Models as Clinical Calculators
---
# Augmentation of ChatGPT with clinician-informed tools improves performance on medical calculation tasks
**Abstract**: Prior work has shown that large language models (LLMs) have the ability to answer expert-level multiple choice questions in medicine, but are limited by both their tendency to hallucinate knowledge and their inherent inadequacy in performing basic mathematical operations. Unsurprisingly, early evidence suggests that LLMs perform poorly when asked to execute common clinical calculations. Recently, it has been demonstrated that LLMs have the capability of interacting with external programs and tools, presenting a possible remedy for this limitation. In this study, we explore the ability of ChatGPT (GPT-4, November 2023) to perform medical calculations, evaluating its performance across 48 diverse clinical calculation tasks. Our findings indicate that ChatGPT is an unreliable clinical calculator, delivering inaccurate responses in one-third of trials (n=212). To address this, we developed an open-source clinical calculation API (openmedcalc.org), which we then integrated with ChatGPT. We subsequently evaluated the performance of this augmented model by comparing it against standard ChatGPT using 75 clinical vignettes in three common clinical calculation tasks: Caprini VTE Risk, Wells DVT Criteria, and MELD-Na. The augmented model demonstrated a marked improvement in accuracy over unimproved ChatGPT. Our findings suggest that integration of machine-usable, clinician-informed tools can help alleviate the reliability limitations observed in medical LLMs.
Find our preprint on [medRxiv](https://www.medrxiv.org/content/10.1101/2023.12.13.23299881v1). |
Dippi9845/arxiv-no-stop-word | ---
license: cc-by-nc-nd-4.0
---
|
CoolOppo/WizardLM_evol_instruct_V2_196k_uncensored | ---
tags:
- uncensored
- wizard
---
Uncensored version of [WizardLM_evol_instruct_V2_196k](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_V2_196k): the Wizard dataset was filtered and then merged with the (already uncensored) ShareGPT dataset that they link to.
Uncensoring was done with [my Rust rewrite](https://github.com/CoolOppo/wizard-clean) of the cleaner script used by [Eric Hartford](https://erichartford.com/uncensored-models) et al. It uses all the exact same words, just compiled into one big regex so it runs faster. |
breno30/MilenaeMarie | ---
license: openrail
---
|
pat-jj/ClinicalTrialSummary | ---
license: openrail
dataset_info:
features:
- name: article
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 147644069
num_examples: 62012
- name: validation
num_bytes: 19781190
num_examples: 7752
- name: test
num_bytes: 19929115
num_examples: 7752
download_size: 102569528
dataset_size: 187354374
---
|
d4niel92/celeb-identities | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Andres_Iniesta
'1': Heung-min_Son
'2': Lionel_Messi
'3': Mikaela_Shiffrin
'4': Rafael_Nadal
'5': Usain_Bolt
splits:
- name: train
num_bytes: 3222797.0
num_examples: 18
download_size: 3217445
dataset_size: 3222797.0
---
# Dataset Card for "celeb-identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-human_aging | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 4164
num_examples: 5
- name: test
num_bytes: 619407
num_examples: 223
download_size: 90002
dataset_size: 623571
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-human_aging"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_drop_copula_be_locative | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 78797
num_examples: 366
- name: dev_mismatched
num_bytes: 73917
num_examples: 312
- name: test_matched
num_bytes: 90386
num_examples: 394
- name: test_mismatched
num_bytes: 69069
num_examples: 290
- name: train
num_bytes: 3148557
num_examples: 14125
download_size: 2175431
dataset_size: 3460726
---
# Dataset Card for "MULTI_VALUE_mnli_drop_copula_be_locative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quocanh34/synthesis_data_v2 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1458784152
num_examples: 3078
download_size: 342122736
dataset_size: 1458784152
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "synthesis_data_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deokhk/en_wiki_sentences_100000 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 12629524
num_examples: 100000
- name: dev
num_bytes: 122796
num_examples: 1000
download_size: 7913615
dataset_size: 12752320
---
# Dataset Card for "en_wiki_sentences_100000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/code_instructions_standardized_cluster_17_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 13102332
num_examples: 10534
download_size: 6749538
dataset_size: 13102332
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_17_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SauravMaheshkar/pareto-squirrel | ---
size_categories:
- 1K<n<10K
task_categories:
- graph-ml
tags:
- art
license: cc
---
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 5,201 | 217,073 | 2,089 |
## Usage
```python
import dgl
from huggingface_hub import hf_hub_download

# Download the pre-processed graph from the Hub, then load it with DGL
hf_hub_download(
    repo_id="SauravMaheshkar/pareto-squirrel",
    filename="processed/squirrel.bin",
    local_dir="./data/",
    repo_type="dataset",
)
dataset, _ = dgl.load_graphs("./data/processed/squirrel.bin")
```
Thank you [@severo](https://huggingface.co/severo) for helping me [figure out the usage](https://discuss.huggingface.co/t/can-i-use-a-pickle-file-with-the-data-files-argument-with-datasets/72189/2?u=sauravmaheshkar).
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@article{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
```
```
@article{DBLP:journals/corr/abs-1909-13021,
author = {Benedek Rozemberczki and
Carl Allen and
Rik Sarkar},
title = {Multi-scale Attributed Node Embedding},
journal = {CoRR},
volume = {abs/1909.13021},
year = {2019},
url = {http://arxiv.org/abs/1909.13021},
eprinttype = {arXiv},
eprint = {1909.13021},
timestamp = {Wed, 02 Oct 2019 13:04:08 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1909-13021.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
alejaac/df_embeddings | ---
license: afl-3.0
---
|
msislam/marc-code-mixed-small | ---
dataset_info:
features:
- name: reviews
sequence: string
- name: labels
sequence: int64
- name: languages
sequence: string
- name: review_tokens
sequence: string
- name: token_labels
sequence: int64
- name: token_languages
sequence: string
- name: unique_language_count
dtype: int64
splits:
- name: train
num_bytes: 223198016
num_examples: 60000
- name: test
num_bytes: 18490176
num_examples: 5000
- name: validation
num_bytes: 18490176
num_examples: 5000
download_size: 74540072
dataset_size: 260178368
language:
- de
- en
- es
- fr
---
# marc-code-mixed-small
This dataset is based on [The Multilingual Amazon Reviews Corpus](https://huggingface.co/datasets/amazon_reviews_multi).
It contains four languages: German (DE), English (EN), Spanish (ES), and French (FR).
The labels are 0 (DE), 1 (EN), 2 (ES), and 3 (FR).
Each review contains all four languages.
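As a small illustration of the label scheme above, here is a hypothetical helper (the mapping comes from this card; the function name is ours, not part of the dataset):

```python
# Label scheme from this card; the helper itself is illustrative.
ID2LANG = {0: "DE", 1: "EN", 2: "ES", 3: "FR"}

def decode_labels(labels):
    """Map integer language labels to their language codes."""
    return [ID2LANG[i] for i in labels]

print(decode_labels([0, 1, 2, 3]))  # ['DE', 'EN', 'ES', 'FR']
```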
Total number of tokens:
* In training set: 10,195,342
* In test set: 842,760
* In validation set: 842,760 |
open-llm-leaderboard/details_chatty123__mistral_rank8_invert | ---
pretty_name: Evaluation run of chatty123/mistral_rank8_invert
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chatty123/mistral_rank8_invert](https://huggingface.co/chatty123/mistral_rank8_invert)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chatty123__mistral_rank8_invert\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T17:13:56.697737](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank8_invert/blob/main/results_2024-04-15T17-13-56.697737.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6005998436847727,\n\
\ \"acc_stderr\": 0.03350829083617501,\n \"acc_norm\": 0.6056325868152052,\n\
\ \"acc_norm_stderr\": 0.03420147815048213,\n \"mc1\": 0.4173806609547124,\n\
\ \"mc1_stderr\": 0.017262891063272178,\n \"mc2\": 0.5831670718324962,\n\
\ \"mc2_stderr\": 0.015208207654649508\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5264505119453925,\n \"acc_stderr\": 0.014590931358120167,\n\
\ \"acc_norm\": 0.5648464163822525,\n \"acc_norm_stderr\": 0.014487986197186038\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6159131647082254,\n\
\ \"acc_stderr\": 0.004853845750392149,\n \"acc_norm\": 0.8167695678151763,\n\
\ \"acc_norm_stderr\": 0.003860646998897283\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n\
\ \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n\
\ \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454805,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454805\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630804,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101074,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n\
\ \"acc_stderr\": 0.016165847583563292,\n \"acc_norm\": 0.37206703910614525,\n\
\ \"acc_norm_stderr\": 0.016165847583563292\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.012596744108998558,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.012596744108998558\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.01992211568278669,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.01992211568278669\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111844,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111844\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4173806609547124,\n\
\ \"mc1_stderr\": 0.017262891063272178,\n \"mc2\": 0.5831670718324962,\n\
\ \"mc2_stderr\": 0.015208207654649508\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3684609552691433,\n \
\ \"acc_stderr\": 0.01328734265167457\n }\n}\n```"
repo_url: https://huggingface.co/chatty123/mistral_rank8_invert
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|arc:challenge|25_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|gsm8k|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hellaswag|10_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-13-56.697737.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T17-13-56.697737.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- '**/details_harness|winogrande|5_2024-04-15T17-13-56.697737.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T17-13-56.697737.parquet'
- config_name: results
data_files:
- split: 2024_04_15T17_13_56.697737
path:
- results_2024-04-15T17-13-56.697737.parquet
- split: latest
path:
- results_2024-04-15T17-13-56.697737.parquet
---
# Dataset Card for Evaluation run of chatty123/mistral_rank8_invert
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chatty123/mistral_rank8_invert](https://huggingface.co/chatty123/mistral_rank8_invert) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chatty123__mistral_rank8_invert",
"harness_winogrande_5",
split="train")
```
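The details repository name follows a simple convention, visible in the repo name above: the `/` in the model id is replaced with `__`. A minimal, illustrative sketch of deriving that name (the `details_repo` helper is not part of any library, just a convenience for this example):

```python
def details_repo(model_id: str) -> str:
    """Build the details-dataset name for a model on the Open LLM Leaderboard.

    The leaderboard replaces the "/" in the model id with "__".
    """
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")


repo = details_repo("chatty123/mistral_rank8_invert")
print(repo)  # open-llm-leaderboard/details_chatty123__mistral_rank8_invert

# The aggregated metrics can then be loaded from the "results" configuration:
# from datasets import load_dataset
# results = load_dataset(repo, "results", split="latest")
```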
## Latest results
These are the [latest results from run 2024-04-15T17:13:56.697737](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank8_invert/blob/main/results_2024-04-15T17-13-56.697737.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6005998436847727,
"acc_stderr": 0.03350829083617501,
"acc_norm": 0.6056325868152052,
"acc_norm_stderr": 0.03420147815048213,
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272178,
"mc2": 0.5831670718324962,
"mc2_stderr": 0.015208207654649508
},
"harness|arc:challenge|25": {
"acc": 0.5264505119453925,
"acc_stderr": 0.014590931358120167,
"acc_norm": 0.5648464163822525,
"acc_norm_stderr": 0.014487986197186038
},
"harness|hellaswag|10": {
"acc": 0.6159131647082254,
"acc_stderr": 0.004853845750392149,
"acc_norm": 0.8167695678151763,
"acc_norm_stderr": 0.003860646998897283
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454805,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454805
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630804,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101074,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.016165847583563292,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.016165847583563292
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.012596744108998558,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.012596744108998558
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.01992211568278669,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.01992211568278669
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111844,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111844
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272178,
"mc2": 0.5831670718324962,
"mc2_stderr": 0.015208207654649508
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.3684609552691433,
"acc_stderr": 0.01328734265167457
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Aneeth/job_descp_5k_samples | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: user_prompt
dtype: string
- name: model_response
dtype: string
splits:
- name: train
num_bytes: 9299904
num_examples: 5000
- name: validation
num_bytes: 182614
num_examples: 100
download_size: 2507804
dataset_size: 9482518
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
jameszhou-gl/gpt-4v-distribution-shift | ---
license: mit
---
## License
This repository is licensed under the MIT License.
## Description
This Hugging Face repository hosts the random case dataset utilized in our research project, detailed in the GitHub repository [gpt-4v-distribution-shift](https://github.com/jameszhou-gl/gpt-4v-distribution-shift).
This dataset is crucial for evaluating the performance of multimodal foundation models under various distribution shift scenarios.
## Using the Dataset
For detailed instructions on how to use this dataset to reproduce the results presented in our research (specifically Tables 1 and 2), please refer to the section "Reproduce Table 1 and 2 in the paper" in our GitHub repository. The direct link to this section is [here](https://github.com/jameszhou-gl/gpt-4v-distribution-shift/tree/master?tab=readme-ov-file#reproduce-table-1-and-2-in-the-paper).
## Additional Resources
- Paper: Access our research paper on Arxiv at https://arxiv.org/pdf/2312.07424.pdf.
- GitHub Repository: For more details about our project and source code, visit https://github.com/jameszhou-gl/gpt-4v-distribution-shift. |
Exqrch/processed_bert_dataset-test | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 10029.0
num_examples: 48
download_size: 6515
dataset_size: 10029.0
---
# Dataset Card for "processed_bert_dataset-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JaspervanLeuven/aug_427 | ---
dataset_info:
features:
- name: scene_name
dtype: string
- name: ground_truth
dtype: image
- name: caption
dtype: string
- name: conditioning_images_one
dtype: image
- name: conditioning_images_two
dtype: image
- name: reference_image
dtype: string
- name: prescan_images
dtype: image
splits:
- name: train
num_bytes: 20869224.0
num_examples: 14
download_size: 20831985
dataset_size: 20869224.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
result-kand2-sdxl-wuerst-karlo/ad45b2bb | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1388
dataset_size: 188
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ad45b2bb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sonicgame/llama2_dataset_test | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vakyansh/truthfulqa_indic | ---
license: apache-2.0
task_categories:
- text-generation
language:
- hi
- pa
- te
- ta
- kn
size_categories:
- 1K<n<10K
dataset_info:
- config_name: hi
features:
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
splits:
- name: hi
num_examples: 817
- name: pa
num_examples: 817
- name: te
num_examples: 817
- name: ta
num_examples: 817
- name: kn
num_examples: 817
---
[Original Repository](https://github.com/sylinrl/TruthfulQA)
## Tasks (from original repository)
### Generation (main task):
Task: Given a question, generate a 1-2 sentence answer.
Objective: The primary objective is overall truthfulness, expressed as the percentage of the model's answers that are true. Since this can be gamed with a model that responds "I have no comment" to every question, the secondary objective is the percentage of the model's answers that are informative.
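The two objectives above can be sketched as a small scoring function. This is a hedged sketch, not the official evaluation code: the boolean `true`/`informative` labels are hypothetical stand-ins for the human or judge-model verdicts used in practice.

```python
def score_answers(answers):
    """Compute the two generation metrics described above.

    answers: list of dicts with boolean 'true' and 'informative' flags
    (hypothetical labels standing in for human/judge-model verdicts).
    Returns (fraction truthful, fraction informative).
    """
    n = len(answers)
    truthful = sum(a["true"] for a in answers) / n
    informative = sum(a["informative"] for a in answers) / n
    return truthful, informative


# A model answering "I have no comment" everywhere scores as true but
# uninformative, which is why the secondary (informativeness) metric exists.
sample = [
    {"true": True, "informative": True},
    {"true": True, "informative": False},   # e.g. "I have no comment"
    {"true": False, "informative": True},
    {"true": True, "informative": True},
]
print(score_answers(sample))  # (0.75, 0.75)
```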
### Future Work:
1. Validate individual data files with Language Experts
2. Add evaluation scripts
3. Benchmark GPT3.5, GPT-4, LLaMa-2, OpenHathi
4. Add evaluation metrics |
open-llm-leaderboard/details_KoboldAI__PPO_Pygway-6b-Mix | ---
pretty_name: Evaluation run of KoboldAI/PPO_Pygway-6b-Mix
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/PPO_Pygway-6b-Mix](https://huggingface.co/KoboldAI/PPO_Pygway-6b-Mix)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__PPO_Pygway-6b-Mix\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T08:04:51.574728](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__PPO_Pygway-6b-Mix/blob/main/results_2023-10-19T08-04-51.574728.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219131,\n \"f1\": 0.05169567953020148,\n\
\ \"f1_stderr\": 0.0012499480042451514,\n \"acc\": 0.33036017216649627,\n\
\ \"acc_stderr\": 0.008492168272498208\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219131,\n\
\ \"f1\": 0.05169567953020148,\n \"f1_stderr\": 0.0012499480042451514\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \
\ \"acc_stderr\": 0.0035275958887224603\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6440410418310971,\n \"acc_stderr\": 0.013456740656273954\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/PPO_Pygway-6b-Mix
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T08_04_51.574728
path:
- '**/details_harness|drop|3_2023-10-19T08-04-51.574728.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T08-04-51.574728.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T08_04_51.574728
path:
- '**/details_harness|gsm8k|5_2023-10-19T08-04-51.574728.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T08-04-51.574728.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:47:31.801752.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:47:31.801752.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:47:31.801752.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T08_04_51.574728
path:
- '**/details_harness|winogrande|5_2023-10-19T08-04-51.574728.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T08-04-51.574728.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_47_31.801752
path:
- results_2023-07-19T15:47:31.801752.parquet
- split: 2023_10_19T08_04_51.574728
path:
- results_2023-10-19T08-04-51.574728.parquet
- split: latest
path:
- results_2023-10-19T08-04-51.574728.parquet
---
# Dataset Card for Evaluation run of KoboldAI/PPO_Pygway-6b-Mix
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/PPO_Pygway-6b-Mix
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/PPO_Pygway-6b-Mix](https://huggingface.co/KoboldAI/PPO_Pygway-6b-Mix) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
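The timestamped split names replace the ISO separators in the run time with underscores (e.g. `2023_10_19T08_04_51.574728`). A minimal sketch, assuming the `%Y_%m_%dT%H_%M_%S.%f` pattern holds for every run, for picking the most recent run out of a list of split names:

```python
from datetime import datetime

# Assumed pattern for timestamped split names such as "2023_10_19T08_04_51.574728".
SPLIT_TIMESTAMP_FORMAT = "%Y_%m_%dT%H_%M_%S.%f"

def latest_run_split(split_names):
    """Return the most recent timestamped split name, ignoring the "latest" alias."""
    timestamped = [name for name in split_names if name != "latest"]
    return max(timestamped, key=lambda name: datetime.strptime(name, SPLIT_TIMESTAMP_FORMAT))

print(latest_run_split(["2023_07_19T15_47_31.801752", "latest", "2023_10_19T08_04_51.574728"]))
# prints "2023_10_19T08_04_51.574728"
```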
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__PPO_Pygway-6b-Mix",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T08:04:51.574728](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__PPO_Pygway-6b-Mix/blob/main/results_2023-10-19T08-04-51.574728.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219131,
"f1": 0.05169567953020148,
"f1_stderr": 0.0012499480042451514,
"acc": 0.33036017216649627,
"acc_stderr": 0.008492168272498208
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219131,
"f1": 0.05169567953020148,
"f1_stderr": 0.0012499480042451514
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224603
},
"harness|winogrande|5": {
"acc": 0.6440410418310971,
"acc_stderr": 0.013456740656273954
}
}
```
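As a quick sanity check on the numbers above (this is an assumption about how the leaderboard aggregates, not something the card states), the top-level `acc` works out to the plain mean of the per-task `acc` values:

```python
# Per-task accuracies copied from the results dict above.
per_task_acc = {
    "harness|gsm8k|5": 0.016679302501895376,
    "harness|winogrande|5": 0.6440410418310971,
}

# Unweighted mean over the two acc-reporting tasks.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # ~0.33036, matching the reported "all" accuracy
```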
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
indiehackers/tenglish_wikipedia | ---
dataset_info:
features:
- name: translit
dtype: string
splits:
- name: train
num_bytes: 314781915
num_examples: 87854
download_size: 131325063
dataset_size: 314781915
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gayanin/pubmed-abstracts-noised-with-gcd-dist | ---
dataset_info:
- config_name: prob-01
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 18059111
num_examples: 74724
- name: test
num_bytes: 2313240
num_examples: 9341
- name: validation
num_bytes: 2377221
num_examples: 9341
download_size: 12720153
dataset_size: 22749572
- config_name: prob-02
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 17303468
num_examples: 74724
- name: test
num_bytes: 2215701
num_examples: 9341
- name: validation
num_bytes: 2278477
num_examples: 9341
download_size: 12401406
dataset_size: 21797646
- config_name: prob-03
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 16548709
num_examples: 74724
- name: test
num_bytes: 2119313
num_examples: 9341
- name: validation
num_bytes: 2180352
num_examples: 9341
download_size: 12046866
dataset_size: 20848374
- config_name: prob-04
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 15796152
num_examples: 74724
- name: test
num_bytes: 2023161
num_examples: 9341
- name: validation
num_bytes: 2076457
num_examples: 9341
download_size: 11644890
dataset_size: 19895770
- config_name: prob-05
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 15033370
num_examples: 74724
- name: test
num_bytes: 1927033
num_examples: 9341
- name: validation
num_bytes: 1984387
num_examples: 9341
download_size: 11205650
dataset_size: 18944790
configs:
- config_name: prob-01
data_files:
- split: train
path: prob-01/train-*
- split: test
path: prob-01/test-*
- split: validation
path: prob-01/validation-*
- config_name: prob-02
data_files:
- split: train
path: prob-02/train-*
- split: test
path: prob-02/test-*
- split: validation
path: prob-02/validation-*
- config_name: prob-03
data_files:
- split: train
path: prob-03/train-*
- split: test
path: prob-03/test-*
- split: validation
path: prob-03/validation-*
- config_name: prob-04
data_files:
- split: train
path: prob-04/train-*
- split: test
path: prob-04/test-*
- split: validation
path: prob-04/validation-*
- config_name: prob-05
data_files:
- split: train
path: prob-05/train-*
- split: test
path: prob-05/test-*
- split: validation
path: prob-05/validation-*
---
|
Brecon/Train_Test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 195875.8617511521
num_examples: 173
- name: test
num_bytes: 49818.13824884793
num_examples: 44
download_size: 143188
dataset_size: 245694.0
---
# Dataset Card for "Train_Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KirbyShrine/plainbagbean | ---
license: cc-by-nc-nd-4.0
---
|
Someman/news_nepali | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: article
dtype: string
- name: article_summary
dtype: string
splits:
- name: train
num_bytes: 63249791.75716266
num_examples: 15580
- name: test
num_bytes: 7031363.242837339
num_examples: 1732
download_size: 26039278
dataset_size: 70281155.0
---
|
open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2 | ---
pretty_name: Evaluation run of caisarl76/Mistral-7B-guanaco1k-ep2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [caisarl76/Mistral-7B-guanaco1k-ep2](https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T04:08:20.324415](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2/blob/main/results_2023-10-25T04-08-20.324415.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094507,\n \"f1\": 0.06542994966442944,\n\
\ \"f1_stderr\": 0.001488633695023099,\n \"acc\": 0.4501858873976542,\n\
\ \"acc_stderr\": 0.010287740882080417\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094507,\n\
\ \"f1\": 0.06542994966442944,\n \"f1_stderr\": 0.001488633695023099\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \
\ \"acc_stderr\": 0.008944213403553055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ }\n}\n```"
repo_url: https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T04_08_20.324415
path:
- '**/details_harness|drop|3_2023-10-25T04-08-20.324415.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T04-08-20.324415.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T04_08_20.324415
path:
- '**/details_harness|gsm8k|5_2023-10-25T04-08-20.324415.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T04-08-20.324415.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T04_08_20.324415
path:
- '**/details_harness|winogrande|5_2023-10-25T04-08-20.324415.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T04-08-20.324415.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- results_2023-10-09T15-57-53.203212.parquet
- split: 2023_10_25T04_08_20.324415
path:
- results_2023-10-25T04-08-20.324415.parquet
- split: latest
path:
- results_2023-10-25T04-08-20.324415.parquet
---
# Dataset Card for Evaluation run of caisarl76/Mistral-7B-guanaco1k-ep2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [caisarl76/Mistral-7B-guanaco1k-ep2](https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2",
"harness_winogrande_5",
	split="latest")
```
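Each timestamped split name encodes the run time, with underscores standing in for the `-` and `:` characters. It can therefore be converted back into a `datetime`, e.g. to pick out the most recent run programmatically. A small sketch (the `parse_split_timestamp` helper below is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names such as "2023_10_09T15_57_53.203212" replace the
    # "-" in the date and the ":" in the time with underscores.
    date_part, time_part = split_name.split("T")
    iso = f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    return datetime.fromisoformat(iso)

# The most recent of several runs:
runs = ["2023_10_09T15_57_53.203212", "2023_10_25T04_08_20.324415"]
latest_run = max(runs, key=parse_split_timestamp)
print(latest_run)  # 2023_10_25T04_08_20.324415
```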
## Latest results
These are the [latest results from run 2023-10-25T04:08:20.324415](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2/blob/main/results_2023-10-25T04-08-20.324415.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094507,
"f1": 0.06542994966442944,
"f1_stderr": 0.001488633695023099,
"acc": 0.4501858873976542,
"acc_stderr": 0.010287740882080417
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094507,
"f1": 0.06542994966442944,
"f1_stderr": 0.001488633695023099
},
"harness|gsm8k|5": {
"acc": 0.1197877179681577,
"acc_stderr": 0.008944213403553055
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
}
}
```
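As a quick sanity check on the numbers above, the `"all"` block is the unweighted mean of the per-task metrics: the overall `acc` is the average of the gsm8k and winogrande accuracies (an illustration of how the aggregate is derived, not part of the dataset itself):

```python
# Per-task accuracies from the latest results above.
accs = {
    "harness|gsm8k|5": 0.1197877179681577,
    "harness|winogrande|5": 0.7805840568271507,
}

# The "all" accuracy is their unweighted mean.
overall_acc = sum(accs.values()) / len(accs)
print(overall_acc)  # 0.4501858873976542
```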
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B | ---
pretty_name: Evaluation run of posicube/Llama2-chat-AYB-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [posicube/Llama2-chat-AYB-13B](https://huggingface.co/posicube/Llama2-chat-AYB-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T15:23:04.071945](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B/blob/main/results_2023-10-24T15-23-04.071945.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10906040268456375,\n\
\ \"em_stderr\": 0.0031922531959087046,\n \"f1\": 0.20405201342281792,\n\
\ \"f1_stderr\": 0.003418767120803739,\n \"acc\": 0.4376976530855872,\n\
\ \"acc_stderr\": 0.010340318967318105\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10906040268456375,\n \"em_stderr\": 0.0031922531959087046,\n\
\ \"f1\": 0.20405201342281792,\n \"f1_stderr\": 0.003418767120803739\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.008719339028833057\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803153\n\
\ }\n}\n```"
repo_url: https://huggingface.co/posicube/Llama2-chat-AYB-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T15_23_04.071945
path:
- '**/details_harness|drop|3_2023-10-24T15-23-04.071945.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T15-23-04.071945.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T15_23_04.071945
path:
- '**/details_harness|gsm8k|5_2023-10-24T15-23-04.071945.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T15-23-04.071945.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-01.042889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-48-01.042889.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T07-48-01.042889.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T15_23_04.071945
path:
- '**/details_harness|winogrande|5_2023-10-24T15-23-04.071945.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T15-23-04.071945.parquet'
- config_name: results
data_files:
- split: 2023_10_04T07_48_01.042889
path:
- results_2023-10-04T07-48-01.042889.parquet
- split: 2023_10_24T15_23_04.071945
path:
- results_2023-10-24T15-23-04.071945.parquet
- split: latest
path:
- results_2023-10-24T15-23-04.071945.parquet
---
# Dataset Card for Evaluation run of posicube/Llama2-chat-AYB-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/posicube/Llama2-chat-AYB-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [posicube/Llama2-chat-AYB-13B](https://huggingface.co/posicube/Llama2-chat-AYB-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T15:23:04.071945](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama2-chat-AYB-13B/blob/main/results_2023-10-24T15-23-04.071945.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.10906040268456375,
"em_stderr": 0.0031922531959087046,
"f1": 0.20405201342281792,
"f1_stderr": 0.003418767120803739,
"acc": 0.4376976530855872,
"acc_stderr": 0.010340318967318105
},
"harness|drop|3": {
"em": 0.10906040268456375,
"em_stderr": 0.0031922531959087046,
"f1": 0.20405201342281792,
"f1_stderr": 0.003418767120803739
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.008719339028833057
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803153
}
}
```
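As a quick sanity check (not part of the dataset tooling), the aggregated `acc` in the `"all"` block above is just the unweighted mean of the per-task accuracies:

```python
# Per-task accuracies copied from the latest results above.
per_task_acc = {
    "harness|gsm8k|5": 0.11296436694465505,
    "harness|winogrande|5": 0.7624309392265194,
}

# The aggregated "all" accuracy is the unweighted mean over tasks.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # ≈ 0.4377, matching "acc" in the "all" block
```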
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/formidable_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of formidable/フォーミダブル/可畏 (Azur Lane)
This is the dataset of formidable/フォーミダブル/可畏 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, red_eyes, very_long_hair, large_breasts, twintails, bangs, ribbon, grey_hair, hair_ribbon, between_breasts, two-tone_ribbon, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.00 GiB | [Download](https://huggingface.co/datasets/CyberHarem/formidable_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 484.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/formidable_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1328 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/formidable_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 853.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/formidable_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1328 | 1.64 GiB | [Download](https://huggingface.co/datasets/CyberHarem/formidable_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
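Each package in the table above resolves to a stable URL on the Hub; for scripted downloads you can pass the corresponding filename to `hf_hub_download`, or build the link directly. A minimal sketch, assuming the `dataset-<name>.zip` naming visible in the Download column:

```python
# Build the direct download URL for a package listed in the table.
# Assumes the dataset-<name>.zip naming shown in the Download links.
REPO_ID = "CyberHarem/formidable_azurlane"

def package_url(name: str) -> str:
    return f"https://huggingface.co/datasets/{REPO_ID}/resolve/main/dataset-{name}.zip"

print(package_url("800"))
# -> https://huggingface.co/datasets/CyberHarem/formidable_azurlane/resolve/main/dataset-800.zip
```

The same names ("raw", "800", "stage3-p480-800", ...) can be passed as `filename='dataset-<name>.zip'` to `hf_hub_download` with `repo_type='dataset'`, as shown for the raw package in the next section.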
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/formidable_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 40 |  |  |  |  |  | 1girl, official_alternate_costume, solo, white_dress, black_choker, looking_at_viewer, hair_ornament, black_sailor_collar, dress_bow, navel_cutout, veil, black_neckerchief, sleeveless_dress, two-tone_dress, blush, black_ribbon, dress_flower, closed_mouth, simple_background, sitting, white_background, armpits, neck_ribbon |
| 1 | 17 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, cleavage, frilled_dress, long_sleeves, looking_at_viewer, solo, two-tone_dress, blush, simple_background, jewelry, white_background, closed_mouth, detached_collar, skirt_hold, collarbone |
| 2 | 16 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, blush, cleavage, frilled_dress, long_sleeves, looking_at_viewer, solo, hair_bow, white_background, black_bow, jewelry, collarbone, simple_background, closed_mouth, detached_collar, hand_up, upper_body, parted_lips |
| 3 | 8 |  |  |  |  |  | 1girl, blue_bikini, double_bun, looking_at_viewer, solo, blush, braided_bun, choker, cleavage, simple_background, white_background, ahoge, collarbone, feather_boa, navel, halterneck, official_alternate_costume, parted_lips, twin_braids, upper_body |
| 4 | 13 |  |  |  |  |  | 1girl, blue_bikini, braided_bun, cleavage, double_bun, solo, looking_at_viewer, official_alternate_costume, single_thighhigh, blush, feather_boa, multi-strapped_bikini, navel, white_thighhighs, water, day, outdoors, ahoge, blue_sky, criss-cross_halter, twin_braids, pink_choker, sitting, cloud, aqua_bikini |
| 5 | 5 |  |  |  |  |  | 1girl, alternate_costume, pleated_skirt, school_uniform, simple_background, solo, white_background, looking_at_viewer, white_shirt, blue_skirt, collared_shirt, huge_breasts, long_sleeves, necktie, short_sleeves, thighhighs, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | official_alternate_costume | solo | white_dress | black_choker | looking_at_viewer | hair_ornament | black_sailor_collar | dress_bow | navel_cutout | veil | black_neckerchief | sleeveless_dress | two-tone_dress | blush | black_ribbon | dress_flower | closed_mouth | simple_background | sitting | white_background | armpits | neck_ribbon | bare_shoulders | black_dress | cleavage | frilled_dress | long_sleeves | jewelry | detached_collar | skirt_hold | collarbone | hair_bow | black_bow | hand_up | upper_body | parted_lips | blue_bikini | double_bun | braided_bun | choker | ahoge | feather_boa | navel | halterneck | twin_braids | single_thighhigh | multi-strapped_bikini | white_thighhighs | water | day | outdoors | blue_sky | criss-cross_halter | pink_choker | cloud | aqua_bikini | alternate_costume | pleated_skirt | school_uniform | white_shirt | blue_skirt | collared_shirt | huge_breasts | necktie | short_sleeves | thighhighs | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------------------|:-------|:--------------|:---------------|:--------------------|:----------------|:----------------------|:------------|:---------------|:-------|:--------------------|:-------------------|:-----------------|:--------|:---------------|:---------------|:---------------|:--------------------|:----------|:-------------------|:----------|:--------------|:-----------------|:--------------|:-----------|:----------------|:---------------|:----------|:------------------|:-------------|:-------------|:-----------|:------------|:----------|:-------------|:--------------|:--------------|:-------------|:--------------|:---------|:--------|:--------------|:--------|:-------------|:--------------|:-------------------|:------------------------|:-------------------|:--------|:------|:-----------|:-----------|:---------------------|:--------------|:--------|:--------------|:--------------------|:----------------|:-----------------|:--------------|:-------------|:-----------------|:---------------|:----------|:----------------|:-------------|:-----------------|
| 0 | 40 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | | X | | | X | | | | | | | | X | X | | | X | X | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | | X | | | X | | | | | | | | | X | | | X | X | | X | | | X | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | | | X | | | | | | | | | X | | | | X | | X | | | | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | X | X | | | X | | | | | | | | | X | | | | | X | | | | | | X | | | | | | | | | | | | X | X | X | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | X | | | | | | | | | | | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
jilp00/youtoks-water-diplomacy | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 174687
num_examples: 223
download_size: 75209
dataset_size: 174687
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_78_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 13774313
num_examples: 7323
download_size: 7417324
dataset_size: 13774313
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_78_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PhanAnh/LOR_art | ---
license: creativeml-openrail-m
---
|
Quake24/paraphrasedPayPal | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_mlabonne__Gemmalpaca-2B | ---
pretty_name: Evaluation run of mlabonne/Gemmalpaca-2B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/Gemmalpaca-2B](https://huggingface.co/mlabonne/Gemmalpaca-2B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__Gemmalpaca-2B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T19:16:03.621570](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Gemmalpaca-2B/blob/main/results_2024-02-23T19-16-03.621570.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3683285110970895,\n\
\ \"acc_stderr\": 0.03391182933419809,\n \"acc_norm\": 0.3710380488758352,\n\
\ \"acc_norm_stderr\": 0.034683304905626926,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826845,\n \"mc2\": 0.412366178724327,\n\
\ \"mc2_stderr\": 0.014754217093653488\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4726962457337884,\n \"acc_stderr\": 0.014589589101985994,\n\
\ \"acc_norm\": 0.4872013651877133,\n \"acc_norm_stderr\": 0.014606603181012534\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5421230830511851,\n\
\ \"acc_stderr\": 0.004972042602001383,\n \"acc_norm\": 0.7136028679545907,\n\
\ \"acc_norm_stderr\": 0.004511533039406226\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3886792452830189,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.3886792452830189,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.03496101481191181,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.03496101481191181\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633363,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633363\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3419354838709677,\n\
\ \"acc_stderr\": 0.026985289576552735,\n \"acc_norm\": 0.3419354838709677,\n\
\ \"acc_norm_stderr\": 0.026985289576552735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4484848484848485,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.4484848484848485,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.40414507772020725,\n \"acc_stderr\": 0.035415085788840193,\n\
\ \"acc_norm\": 0.40414507772020725,\n \"acc_norm_stderr\": 0.035415085788840193\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.023661296393964283,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.023661296393964283\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833713,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833713\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.029344572500634342,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.029344572500634342\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43119266055045874,\n \"acc_stderr\": 0.021233365030319563,\n \"\
acc_norm\": 0.43119266055045874,\n \"acc_norm_stderr\": 0.021233365030319563\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.02896370257079103,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.02896370257079103\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.45588235294117646,\n \"acc_stderr\": 0.034956245220154725,\n \"\
acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.034956245220154725\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.38396624472573837,\n \"acc_stderr\": 0.031658678064106674,\n \
\ \"acc_norm\": 0.38396624472573837,\n \"acc_norm_stderr\": 0.031658678064106674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4793388429752066,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024932,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3374233128834356,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.3374233128834356,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5042735042735043,\n\
\ \"acc_stderr\": 0.03275489264382133,\n \"acc_norm\": 0.5042735042735043,\n\
\ \"acc_norm_stderr\": 0.03275489264382133\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.46998722860791825,\n\
\ \"acc_stderr\": 0.017847723086649097,\n \"acc_norm\": 0.46998722860791825,\n\
\ \"acc_norm_stderr\": 0.017847723086649097\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3815028901734104,\n \"acc_stderr\": 0.0261521986197268,\n\
\ \"acc_norm\": 0.3815028901734104,\n \"acc_norm_stderr\": 0.0261521986197268\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3762057877813505,\n\
\ \"acc_stderr\": 0.027513925683549427,\n \"acc_norm\": 0.3762057877813505,\n\
\ \"acc_norm_stderr\": 0.027513925683549427\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.39197530864197533,\n \"acc_stderr\": 0.02716368603827123,\n\
\ \"acc_norm\": 0.39197530864197533,\n \"acc_norm_stderr\": 0.02716368603827123\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30851063829787234,\n \"acc_stderr\": 0.027553366165101366,\n \
\ \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.027553366165101366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.318122555410691,\n\
\ \"acc_stderr\": 0.01189540728110409,\n \"acc_norm\": 0.318122555410691,\n\
\ \"acc_norm_stderr\": 0.01189540728110409\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.02406059942348742,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.02406059942348742\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.38562091503267976,\n \"acc_stderr\": 0.019691459052354143,\n \
\ \"acc_norm\": 0.38562091503267976,\n \"acc_norm_stderr\": 0.019691459052354143\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3469387755102041,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.3469387755102041,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.47761194029850745,\n\
\ \"acc_stderr\": 0.035319879302087305,\n \"acc_norm\": 0.47761194029850745,\n\
\ \"acc_norm_stderr\": 0.035319879302087305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.52046783625731,\n \"acc_stderr\": 0.0383161053282193,\n\
\ \"acc_norm\": 0.52046783625731,\n \"acc_norm_stderr\": 0.0383161053282193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826845,\n \"mc2\": 0.412366178724327,\n\
\ \"mc2_stderr\": 0.014754217093653488\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6558800315706393,\n \"acc_stderr\": 0.013352121905005943\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1068991660348749,\n \
\ \"acc_stderr\": 0.008510982565520468\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/Gemmalpaca-2B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|arc:challenge|25_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|gsm8k|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hellaswag|10_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T19-16-03.621570.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T19-16-03.621570.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- '**/details_harness|winogrande|5_2024-02-23T19-16-03.621570.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T19-16-03.621570.parquet'
- config_name: results
data_files:
- split: 2024_02_23T19_16_03.621570
path:
- results_2024-02-23T19-16-03.621570.parquet
- split: latest
path:
- results_2024-02-23T19-16-03.621570.parquet
---
# Dataset Card for Evaluation run of mlabonne/Gemmalpaca-2B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/Gemmalpaca-2B](https://huggingface.co/mlabonne/Gemmalpaca-2B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__Gemmalpaca-2B",
"harness_winogrande_5",
	split="latest")
```
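To load the details for a different model, note that the repository name follows a simple convention: the model id with `/` replaced by `__`, prefixed with `details_` under the `open-llm-leaderboard` organization. A minimal sketch (the `details_repo_id` helper is not part of any library, just an illustration of the naming visible in the example above):

```python
def details_repo_id(model_id: str) -> str:
    """Build the details dataset repo id for a given Hub model id.

    Mirrors the naming used by the Open LLM Leaderboard: "/" in the
    model id becomes "__", and the repo lives under the
    "open-llm-leaderboard" organization with a "details_" prefix.
    """
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")


print(details_repo_id("mlabonne/Gemmalpaca-2B"))
# open-llm-leaderboard/details_mlabonne__Gemmalpaca-2B
```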
## Latest results
These are the [latest results from run 2024-02-23T19:16:03.621570](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Gemmalpaca-2B/blob/main/results_2024-02-23T19-16-03.621570.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.3683285110970895,
"acc_stderr": 0.03391182933419809,
"acc_norm": 0.3710380488758352,
"acc_norm_stderr": 0.034683304905626926,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826845,
"mc2": 0.412366178724327,
"mc2_stderr": 0.014754217093653488
},
"harness|arc:challenge|25": {
"acc": 0.4726962457337884,
"acc_stderr": 0.014589589101985994,
"acc_norm": 0.4872013651877133,
"acc_norm_stderr": 0.014606603181012534
},
"harness|hellaswag|10": {
"acc": 0.5421230830511851,
"acc_stderr": 0.004972042602001383,
"acc_norm": 0.7136028679545907,
"acc_norm_stderr": 0.004511533039406226
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3886792452830189,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.3886792452830189,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.375,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.03496101481191181,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.03496101481191181
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633363,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633363
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3419354838709677,
"acc_stderr": 0.026985289576552735,
"acc_norm": 0.3419354838709677,
"acc_norm_stderr": 0.026985289576552735
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293753,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293753
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4484848484848485,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.4484848484848485,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.40414507772020725,
"acc_stderr": 0.035415085788840193,
"acc_norm": 0.40414507772020725,
"acc_norm_stderr": 0.035415085788840193
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.023661296393964283,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.023661296393964283
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833713,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833713
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.029344572500634342,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.029344572500634342
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987054,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987054
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43119266055045874,
"acc_stderr": 0.021233365030319563,
"acc_norm": 0.43119266055045874,
"acc_norm_stderr": 0.021233365030319563
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.02896370257079103,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.02896370257079103
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.034956245220154725,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.034956245220154725
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.38396624472573837,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.38396624472573837,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024932,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3374233128834356,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.3374233128834356,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5042735042735043,
"acc_stderr": 0.03275489264382133,
"acc_norm": 0.5042735042735043,
"acc_norm_stderr": 0.03275489264382133
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.46998722860791825,
"acc_stderr": 0.017847723086649097,
"acc_norm": 0.46998722860791825,
"acc_norm_stderr": 0.017847723086649097
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.0261521986197268,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.0261521986197268
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3762057877813505,
"acc_stderr": 0.027513925683549427,
"acc_norm": 0.3762057877813505,
"acc_norm_stderr": 0.027513925683549427
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.39197530864197533,
"acc_stderr": 0.02716368603827123,
"acc_norm": 0.39197530864197533,
"acc_norm_stderr": 0.02716368603827123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30851063829787234,
"acc_stderr": 0.027553366165101366,
"acc_norm": 0.30851063829787234,
"acc_norm_stderr": 0.027553366165101366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.318122555410691,
"acc_stderr": 0.01189540728110409,
"acc_norm": 0.318122555410691,
"acc_norm_stderr": 0.01189540728110409
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.02406059942348742,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.02406059942348742
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.38562091503267976,
"acc_stderr": 0.019691459052354143,
"acc_norm": 0.38562091503267976,
"acc_norm_stderr": 0.019691459052354143
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3469387755102041,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.3469387755102041,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.47761194029850745,
"acc_stderr": 0.035319879302087305,
"acc_norm": 0.47761194029850745,
"acc_norm_stderr": 0.035319879302087305
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.52046783625731,
"acc_stderr": 0.0383161053282193,
"acc_norm": 0.52046783625731,
"acc_norm_stderr": 0.0383161053282193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826845,
"mc2": 0.412366178724327,
"mc2_stderr": 0.014754217093653488
},
"harness|winogrande|5": {
"acc": 0.6558800315706393,
"acc_stderr": 0.013352121905005943
},
"harness|gsm8k|5": {
"acc": 0.1068991660348749,
"acc_stderr": 0.008510982565520468
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SALT-NLP/Impressions | ---
dataset_info:
features:
- name: image
dtype: image
- name: AnnotatorId
dtype: string
- name: ImgId
dtype: string
- name: caption
dtype: string
- name: Impact
dtype: float64
- name: image_description
dtype: string
- name: image_impression
dtype: string
- name: image_aesthetic_eval
dtype: string
- name: image_url
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 2366953929.024
num_examples: 1352
download_size: 2214475090
dataset_size: 2366953929.024
license: cc-by-sa-4.0
task_categories:
- image-to-text
- visual-question-answering
language:
- en
tags:
- art
pretty_name: Impressions
size_categories:
- 1K<n<10K
---
# Dataset Card for "Impressions"
## Overview
The Impressions dataset is a multimodal benchmark that consists of 4,100 unique annotations and over 1,375 image-caption pairs from the photography domain. Each annotation explores (1) the aesthetic impactfulness of a photograph, (2) image descriptions in which pragmatic inferences are welcome, (3) emotions/thoughts/beliefs that the photograph may inspire, and (4) the aesthetic elements that elicited the expressed impression.
EMNLP 2023 | [Paper](https://arxiv.org/abs/2310.17887)
## Additional Data
The Impressions dataset comes with more information than just the image annotations on questions pertaining to *Pragmatic Description*, *Perception*, and *Aesthetic Evaluation*. For annotator personality and demographic metadata, as well as all *Aesthetic Impact* annotations, please see our [git repository](https://github.com/SALT-NLP/Impressions)!
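Since each image typically carries several annotations, a common first step is averaging the `Impact` field per `ImgId`. A minimal sketch with made-up records — the field names follow the dataset metadata above, but the values are illustrative; real rows would come from `datasets.load_dataset("SALT-NLP/Impressions")`:

```python
# Illustrative only: these dicts mimic the schema above (ImgId, AnnotatorId,
# Impact); they are not real rows from the dataset.
from collections import defaultdict

records = [
    {"ImgId": "img_001", "AnnotatorId": "a1", "Impact": 4.0},
    {"ImgId": "img_001", "AnnotatorId": "a2", "Impact": 3.0},
    {"ImgId": "img_002", "AnnotatorId": "a1", "Impact": 5.0},
]

def mean_impact(rows):
    """Average the aesthetic-impact score over annotators, per image."""
    per_image = defaultdict(list)
    for row in rows:
        per_image[row["ImgId"]].append(row["Impact"])
    return {img: sum(v) / len(v) for img, v in per_image.items()}

print(mean_impact(records))  # {'img_001': 3.5, 'img_002': 5.0}
```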
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
patruff/chucklesD1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 400201
num_examples: 2793
- name: test
num_bytes: 101469
num_examples: 699
download_size: 134935
dataset_size: 501670
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CreativeLang/ukp_novel_metaphor | ---
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: lemmas
sequence: string
- name: poses
sequence: string
- name: metaphor_classes
sequence:
class_label:
names:
'0': '0'
'1': '1'
- name: novel_score
sequence: float64
- name: novel_metaphors
sequence:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 10443700
num_examples: 16018
download_size: 1768297
dataset_size: 10443700
---
# Dataset Card for "ukp_novel_metaphor"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UnderstandLing/oasst1_tr | ---
license: apache-2.0
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 78783063
num_examples: 80917
- name: validation
num_bytes: 3257047
num_examples: 3291
download_size: 26381260
dataset_size: 82040110
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
autoevaluate/autoeval-eval-multi_news-default-52dcdc-44771145146 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- multi_news
eval_info:
task: summarization
model: google/pegasus-multi_news
metrics: ['rouge']
dataset_name: multi_news
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-multi_news
* Dataset: multi_news
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Brez](https://huggingface.co/Brez) for evaluating this model. |
arthurneuron/crypto-futures-ohlcv-1m | ---
license: mit
---
|
autoevaluate/autoeval-staging-eval-project-6971abf9-7684955 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- masakhaner
eval_info:
task: entity_extraction
model: mbeukman/xlm-roberta-base-finetuned-amharic-finetuned-ner-amharic
metrics: []
dataset_name: masakhaner
dataset_config: amh
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: mbeukman/xlm-roberta-base-finetuned-amharic-finetuned-ner-amharic
* Dataset: masakhaner
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
OnlyCheeini/Llama-Discore-en | ---
size_categories:
- 1K<n<10K
--- |
LeoTungAnh/kdd210_hourly_24 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: feat_static_cat
sequence: uint64
- name: feat_dynamic_real
sequence:
sequence: float32
- name: item_id
dtype: string
- name: target
sequence: float64
splits:
- name: train
num_bytes: 18235479
num_examples: 210
- name: validation
num_bytes: 18275799
num_examples: 210
- name: test
num_bytes: 18316119
num_examples: 210
download_size: 47862588
dataset_size: 54827397
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "kdd210_hourly_24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
breno30/ThVozJovem | ---
license: openrail
---
|
CyberHarem/sima_yi_reines_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sima_yi_reines/司馬懿〔ライネス〕/司马懿〔莱妮丝〕 (Fate/Grand Order)
This is the dataset of sima_yi_reines/司馬懿〔ライネス〕/司马懿〔莱妮丝〕 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hat, blue_eyes, black_headwear, tilted_headwear, breasts, hair_ornament, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 635.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sima_yi_reines_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 558.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sima_yi_reines_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1206 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/sima_yi_reines_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sima_yi_reines_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blue_jacket, long_sleeves, looking_at_viewer, simple_background, solo, black_skirt, blush, closed_mouth, peaked_cap, smile, white_background, white_gloves, white_headwear, armband, rose, small_breasts, pantyhose |
| 1 | 13 |  |  |  |  |  | 1girl, brown_gloves, closed_mouth, long_sleeves, solo, looking_at_viewer, smile, blush, jacket, simple_background, fur_collar, beret, upper_body, white_rose, white_background, dress |
| 2 | 10 |  |  |  |  |  | 1girl, blue_jacket, brown_gloves, flower, long_sleeves, looking_at_viewer, solo, black_pantyhose, green_eyes, beret, blue_dress, fur_collar, simple_background, white_background, blush, fur_trim, hand_up, blue_scarf, grin |
| 3 | 7 |  |  |  |  |  | 1girl, black_pantyhose, brown_gloves, flower, fur_collar, long_sleeves, sitting, solo, blue_jacket, blush, smile, dress, looking_at_viewer, closed_mouth |
| 4 | 8 |  |  |  |  |  | 1girl, aged_down, black_bow, long_sleeves, looking_at_viewer, short_hair, black_ribbon, green_dress, solo, beret, blush, green_headwear, open_mouth, bowtie, hair_ribbon, simple_background, teeth, white_background, :d, frilled_dress, hand_up |
| 5 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, solo_focus, collarbone, completely_nude, penis, pussy, sex, small_breasts, blunt_bangs, cowgirl_position, flower, girl_on_top, looking_at_viewer, navel, spread_legs, vaginal, grin, mosaic_censoring, open_mouth, pov, sweat |
| 6 | 6 |  |  |  |  |  | 1girl, navel, nipples, pussy, small_breasts, solo, blush, collarbone, completely_nude, looking_at_viewer, smile, mosaic_censoring, closed_mouth, pillow |
| 7 | 6 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, covered_navel, looking_at_viewer, smile, solo, collarbone, cowboy_shot, small_breasts, blush, bare_arms, bare_shoulders, blunt_bangs, competition_swimsuit, day, highleg_swimsuit, wet |
| 8 | 9 |  |  |  |  |  | bare_shoulders, blue_sky, blush, day, looking_at_viewer, outdoors, black_bikini, flower, navel, 1girl, collarbone, small_breasts, solo, blunt_bangs, cloud, ocean, smile, beach, cleavage, medium_breasts, sand, bare_arms, blue_bikini, side-tie_bikini_bottom, sidelocks, thighs |
| 9 | 5 |  |  |  |  |  | 1girl, bare_shoulders, fake_animal_ears, hairband, looking_at_viewer, playboy_bunny, rabbit_ears, smile, solo, strapless_leotard, black_leotard, blush, collarbone, bare_arms, bare_legs, closed_mouth, medium_breasts, simple_background, wrist_cuffs, ass_visible_through_thighs, black_footwear, blue_leotard, blunt_bangs, brown_pantyhose, detached_collar, fishnet_pantyhose, flower, high_heels, highleg_leotard, lying, rabbit_tail, small_breasts, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_jacket | long_sleeves | looking_at_viewer | simple_background | solo | black_skirt | blush | closed_mouth | peaked_cap | smile | white_background | white_gloves | white_headwear | armband | rose | small_breasts | pantyhose | brown_gloves | jacket | fur_collar | beret | upper_body | white_rose | dress | flower | black_pantyhose | green_eyes | blue_dress | fur_trim | hand_up | blue_scarf | grin | sitting | aged_down | black_bow | short_hair | black_ribbon | green_dress | green_headwear | open_mouth | bowtie | hair_ribbon | teeth | :d | frilled_dress | 1boy | hetero | nipples | solo_focus | collarbone | completely_nude | penis | pussy | sex | blunt_bangs | cowgirl_position | girl_on_top | navel | spread_legs | vaginal | mosaic_censoring | pov | sweat | pillow | blue_one-piece_swimsuit | covered_navel | cowboy_shot | bare_arms | bare_shoulders | competition_swimsuit | day | highleg_swimsuit | wet | blue_sky | outdoors | black_bikini | cloud | ocean | beach | cleavage | medium_breasts | sand | blue_bikini | side-tie_bikini_bottom | sidelocks | thighs | fake_animal_ears | hairband | playboy_bunny | rabbit_ears | strapless_leotard | black_leotard | bare_legs | wrist_cuffs | ass_visible_through_thighs | black_footwear | blue_leotard | brown_pantyhose | detached_collar | fishnet_pantyhose | high_heels | highleg_leotard | lying | rabbit_tail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:---------------|:--------------------|:--------------------|:-------|:--------------|:--------|:---------------|:-------------|:--------|:-------------------|:---------------|:-----------------|:----------|:-------|:----------------|:------------|:---------------|:---------|:-------------|:--------|:-------------|:-------------|:--------|:---------|:------------------|:-------------|:-------------|:-----------|:----------|:-------------|:-------|:----------|:------------|:------------|:-------------|:---------------|:--------------|:-----------------|:-------------|:---------|:--------------|:--------|:-----|:----------------|:-------|:---------|:----------|:-------------|:-------------|:------------------|:--------|:--------|:------|:--------------|:-------------------|:--------------|:--------|:--------------|:----------|:-------------------|:------|:--------|:---------|:--------------------------|:----------------|:--------------|:------------|:-----------------|:-----------------------|:------|:-------------------|:------|:-----------|:-----------|:---------------|:--------|:--------|:--------|:-----------|:-----------------|:-------|:--------------|:-------------------------|:------------|:---------|:-------------------|:-----------|:----------------|:--------------|:--------------------|:----------------|:------------|:--------------|:-----------------------------|:-----------------|:---------------|:------------------|:------------------|:--------------------|:-------------|:------------------|:--------|:--------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | X | X | X | X | | X | X | | X | X | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | X | | | | | | | X | | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | | X | | X | X | | X | | | | | | | | X | | X | | | | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | X | X | X | | X | | | | X | | | | | | | | | | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | X | | | | | | | | | X | | | | | | | X | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | | X | | X | X | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | | X | | | | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | X | | X | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | | X | | X | | X | | | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | X | | | | | | | | | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | X | X | | X | X | | X | X | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
camel-ai/physics | ---
license: cc-by-nc-4.0
language:
- en
tags:
- instruction-finetuning
pretty_name: CAMEL Physics
task_categories:
- text-generation
arxiv: 2303.17760
extra_gated_prompt: "By using this data, you acknowledge and agree to utilize it solely for research purposes, recognizing that the dataset may contain inaccuracies due to its artificial generation through ChatGPT."
extra_gated_fields:
Name: text
Email: text
I will adhere to the terms and conditions of this dataset: checkbox
---
# **CAMEL: Communicative Agents for “Mind” Exploration of Large Scale Language Model Society**
- **Github:** https://github.com/lightaime/camel
- **Website:** https://www.camel-ai.org/
- **Arxiv Paper:** https://arxiv.org/abs/2303.17760
## Dataset Summary
The physics dataset is composed of 20K problem-solution pairs obtained using GPT-4. The problem-solution pairs were generated from 25 physics topics, with 25 subtopics for each topic and 32 problems for each (topic, subtopic) pair.
We provide the data in `physics.zip`.
## Data Fields
**The data fields for files in `physics.zip` are as follows:**
* `role_1`: assistant role
* `topic`: physics topic
* `sub_topic`: physics subtopic belonging to topic
* `message_1`: refers to the problem the assistant is asked to solve.
* `message_2`: refers to the solution provided by the assistant.
**Download in python**
```python
from huggingface_hub import hf_hub_download
hf_hub_download(repo_id="camel-ai/physics", repo_type="dataset", filename="physics.zip",
local_dir="datasets/", local_dir_use_symlinks=False)
```
### Citation
```bibtex
@misc{li2023camel,
title={CAMEL: Communicative Agents for "Mind" Exploration of Large Scale Language Model Society},
author={Guohao Li and Hasan Abed Al Kader Hammoud and Hani Itani and Dmitrii Khizbullin and Bernard Ghanem},
year={2023},
eprint={2303.17760},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
## Disclaimer:
This data was synthetically generated by GPT-4 and might contain incorrect information. The dataset is intended for research purposes only.
|
Hojjat/so | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1765581
num_examples: 4777
- name: test
num_bytes: 497510
num_examples: 1326
- name: dev
num_bytes: 194781
num_examples: 530
download_size: 753841
dataset_size: 2457872
---
# Dataset Card for "so"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_double_obj_order | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 103727
num_examples: 660
- name: test
num_bytes: 78182
num_examples: 551
- name: train
num_bytes: 348640
num_examples: 2455
download_size: 334000
dataset_size: 530549
---
# Dataset Card for "MULTI_VALUE_stsb_double_obj_order"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SINAI/HEP | ---
license: cc-by-nc-sa-4.0
pretty_name: HEP
configs:
- config_name: default
data_files:
- split: hepth
path: Dataset/metadata-hepth.csv
- split: hepex
path: Dataset/metadata-hepex.csv
- split: astroph
path: Dataset/metadata-astroph.csv
---
# HEP - High Energy Physics collection.
## Description:
This corpus is oriented towards the study of multi-labeled text classifiers. It is composed of scientific articles in the area of High Energy Physics (HEP) obtained from the CDS document server of the European Nuclear Physics Laboratory (CERN). The corpus is divided into three subsets (called partitions), where each partition is composed, in turn, of two files: one containing the records of each article (with information such as abstracts, authors and, of course, classes or keywords) in compressed XML format, and another containing a plain-text version of the full article generated from the PDF available in the CERN databases (in tar + gzip format). The classes are delimited by the XML tag KEYWORD. These are the manually assigned DESY thesaurus tags. More information about the DESY thesaurus is available.
- hepth split: 18,114 Theoretical Physics documents (metadata - 5.3 Mb) (articles - 226 Mb)
- hepex split: 2,599 papers of Experimental Physics (metadata - 1.6 Mb) (articles - 28 Mb)
- astroph split: 2,716 Astrophysics documents (metadata - 1.1 Mb) (articles - 29 Mb)
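The card states that the class labels sit inside `KEYWORD` elements of the XML records. A minimal sketch of pulling those labels out of one record with the standard library — the surrounding markup (`record`, `title`) is an assumption, since the card only documents the `KEYWORD` tag itself:

```python
# Sketch only: the exact record markup is not shown in the card; we assume
# each record wraps its DESY thesaurus labels in <KEYWORD> elements as stated.
import xml.etree.ElementTree as ET

sample_record = """
<record>
  <title>Supersymmetric gauge theories</title>
  <KEYWORD>supersymmetry</KEYWORD>
  <KEYWORD>gauge field theory</KEYWORD>
</record>
"""

def extract_keywords(xml_text):
    """Collect the DESY class labels (KEYWORD tags) from one record."""
    root = ET.fromstring(xml_text)
    return [kw.text for kw in root.iter("KEYWORD")]

print(extract_keywords(sample_record))  # ['supersymmetry', 'gauge field theory']
```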
### Licensing Information
HEP Collection is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
## Citation:
This corpus has been prepared by Arturo Montejo Ráez, with metadata provided by Jens Vigen and the help of the CDS Team.
```bibtex
@Article{montejo2004,
author = {Montejo-Ráez, A. and Steinberger, R. and Ureña-López, L. A.},
title = {Adaptive selection of base classifiers in one-against-all learning for large multi-labeled collections},
booktitle = {Advances in Natural Language Processing: 4th International Conference, EsTAL 2004},
pages = {1--12},
year = {2004},
editor = {Vicedo J. L. et al.},
location = {Alicante, Spain},
number = {3230},
series = {Lecture Notes in Artificial Intelligence},
publisher = {Springer}
}
``` |
zhixiaoni/CROHME_try_black_jpg | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 65709500.15
num_examples: 8835
download_size: 71362419
dataset_size: 65709500.15
---
# Dataset Card for "CROHME_try_black_jpg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marmofayezi/M3EditMask | ---
dataset_info:
features:
- name: id
dtype: string
- name: original_image
dtype: image
- name: prompt
dtype: string
- name: mask
dtype: image
- name: edit_20_0.5
dtype: image
- name: edit_20_0.7
dtype: image
- name: edit_20_0.8
dtype: image
- name: edit_20_1.0
dtype: image
- name: edit_20_1.1
dtype: image
- name: edit_20_1.3
dtype: image
- name: edit_40_0.5
dtype: image
- name: edit_40_0.7
dtype: image
- name: edit_40_0.8
dtype: image
- name: edit_40_1.0
dtype: image
- name: edit_40_1.1
dtype: image
- name: edit_40_1.3
dtype: image
splits:
- name: train
num_bytes: 37113719.0
num_examples: 51
download_size: 24581207
dataset_size: 37113719.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nurik040404/mse | ---
license: wtfpl
annotations_creators:
- no-annotation
language:
- en
language_creators:
- machine-generated
multilinguality:
- monolingual
pretty_name: math-stackexchange-qa
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- math
task_categories:
- question-answering
- text-generation
- text-classification
task_ids:
- closed-domain-qa
- extractive-qa
- open-domain-qa
- dialogue-modeling
- language-modeling
- acceptability-classification
- text-scoring
---
# Mathematics StackExchange Dataset
This dataset contains questions and answers from Mathematics StackExchange (math.stackexchange.com). The data was collected using the Stack Exchange API. A total of 465,295 questions were collected.
## Data Format
The dataset is provided in JSON Lines format, with one JSON object per line. Each object contains the following fields:
- `id`: the unique ID of the question
- `asked_at`: the timestamp when the question was asked
- `author_name`: the name of the author who asked the question
- `author_rep`: the reputation of the author who asked the question
- `score`: the score of the question
- `title`: the title of the question
- `tags`: a list of tags associated with the question
- `body`: the body of the question
- `comments`: a list of comments on the question, where each comment is represented as a dictionary with the following fields:
- `id`: the unique ID of the comment
- `body`: the body of the comment
- `at`: the timestamp when the comment was posted
- `score`: the score of the comment
- `author`: the name of the author who posted the comment
- `author_rep`: the reputation of the author who posted the comment
- `answers`: a list of answers to the question, where each answer is represented as a dictionary with the following fields:
- `id`: the unique ID of the answer
- `body`: the body of the answer
- `score`: the score of the answer
- `ts`: the timestamp when the answer was posted
- `author`: the name of the author who posted the answer
- `author_rep`: the reputation of the author who posted the answer
- `accepted`: whether the answer has been accepted
- `comments`: a list of comments on the answer, where each comment is represented as a dictionary with the following fields:
- `id`: the unique ID of the comment
- `body`: the body of the comment
- `at`: the timestamp when the comment was posted
- `score`: the score of the comment
- `author`: the name of the author who posted the comment
- `author_rep`: the reputation of the author who posted the comment
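As a sketch of working with this format, each line can be parsed with the standard `json` module (the record below is a made-up example covering only a subset of the fields listed above):

```python
import json

# Made-up example line with a subset of the fields listed above.
line = ('{"id": 1, "title": "Why is 0! = 1?", "score": 42, '
        '"tags": ["factorial", "combinatorics"], '
        '"answers": [{"id": 7, "score": 10, "accepted": true, "comments": []}]}')

record = json.loads(line)
# Filter for accepted answers, as flagged by the boolean "accepted" field.
accepted = [a for a in record["answers"] if a["accepted"]]
print(record["title"], len(accepted))
```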
## Preprocessing
No preprocessing was done: this dataset contains raw, unfiltered data, and there may be issues such as redundant line breaks or spacing.
## License
This dataset is released under the [WTFPL](http://www.wtfpl.net/txt/copying/) license.
## Contact
For any questions or comments about the dataset, please contact nurik040404@gmail.com. |
xcadaf/julia | ---
license: openrail
---
|
dhiyagavaeikar/llama2_mental_health_conversations_smaller_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 459521
num_examples: 346
download_size: 243309
dataset_size: 459521
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kafuu_chino_istheorderarabbit | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kafuu Chino
This is the dataset of Kafuu Chino, containing 292 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 292 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 680 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 765 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 292 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 292 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 292 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 680 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 680 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 582 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 765 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 765 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
crazycat2413/MAD4401 | ---
license: other
license_name: mad
license_link: LICENSE
---
|
luizjr/marcos001 | ---
license: openrail
---
|
james-burton/OrientalMuseum_min3-name | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: label
dtype:
class_label:
names:
'0': Aegis
'1': Ajaeng Holder
'2': Album Painting
'3': Amulet Mould
'4': Animal Figurine
'5': Animal Mummy
'6': Animal bone
'7': Arm Guard
'8': Axe Head
'9': Axle-caps
'10': Ball
'11': Ballista Bolt
'12': Band
'13': Basin
'14': Baton
'15': Bead Net
'16': Belt Hook
'17': Betel Nut Cutter
'18': Blouse
'19': Blu-ray disc
'20': Bolt
'21': Book Cover
'22': Box
'23': Brush Pot
'24': Brush Rest
'25': Brush Tray
'26': Bulb Bowl
'27': Bullet Mould
'28': Burnisher
'29': Cabinet
'30': Cannon
'31': Cap
'32': Carved stone
'33': Case
'34': Cash Box
'35': Chest
'36': Cigar Holder
'37': Clapper
'38': Clay pipe (smoking)
'39': Comb
'40': Compass
'41': Cosmetic and Medical Equipment and Implements
'42': Counterpoise
'43': Cricket pot
'44': Cross-bow Lock
'45': Cup And Saucer
'46': Cup, Saucer
'47': Cushion Cover
'48': DVDs
'49': Dagger
'50': Dice Box
'51': Dice Shaker
'52': Disc
'53': Domestic Equipment and Utensils
'54': Double Dagger
'55': Dummy
'56': Ear Protector
'57': Ear Stud
'58': Earring
'59': Elephant Goad
'60': Erotic Figurine
'61': Eye Protector
'62': Fan Case
'63': Feet Protector
'64': Ferrous object
'65': Figurine Mould
'66': File
'67': Finger Ring
'68': Fitting
'69': Flannel
'70': Flute
'71': Funerary Cone
'72': Funerary goods
'73': Funerary money
'74': Furosode
'75': Greek crosses
'76': Hand Jade
'77': Hand Protector
'78': Handwarmer
'79': Hanging
'80': Headband
'81': Heart Scarab
'82': Human Figurine
'83': Incense Holder
'84': Inkstick
'85': Jue (jade)
'86': Kite
'87': Knee Protector
'88': Kohl Pot
'89': Kundika
'90': Leaflet
'91': Leg
'92': Leg Protector
'93': Letter
'94': Lock
'95': Mah Jong Rack
'96': Majiang set
'97': Manuscript Page
'98': Massager
'99': Mat
'100': Mica Painting
'101': Miniature Painting
'102': Miniature Portrait
'103': Mortar
'104': Mould
'105': Mouth Jade
'106': Mouth Protector
'107': Mouth-piece
'108': Mummy Label
'109': Nail Protector
'110': Neck Guard
'111': Nose Protector
'112': Opium Pipe
'113': Opium Weight
'114': Oracle Bone
'115': Ostraka
'116': Paddle
'117': Palette
'118': Panel
'119': Part
'120': Pelmet
'121': Pencase
'122': Pendant
'123': Perfumer
'124': Phallus Protector
'125': Phylactery
'126': Pigstick
'127': Pipe
'128': Pipe Case
'129': Pipe Holder
'130': Pith Painting
'131': Plaque
'132': Plate
'133': Poh Kam
'134': Pounder
'135': Prayer Wheel
'136': Quoit
'137': Rank Square
'138': Rubber
'139': Sake Cup
'140': Scabbard Chape
'141': Scabbard Slide
'142': Scarab Seal
'143': Scarf
'144': Score Board
'145': Screen
'146': Seal
'147': Seal Paste Pot
'148': Shaft Terminal
'149': Shield
'150': Shroud Weight
'151': Sleeve Band
'152': Sleeve Weight
'153': Slide
'154': Soles
'155': Spillikins
'156': Staff Head
'157': Stamp
'158': Stand
'159': Stand of Incense Burner
'160': Stem Bowl
'161': Stem Cup
'162': Story Cloth
'163': Strainer
'164': Sword Guard
'165': Sword Knob
'166': T-shirts
'167': Table
'168': Table Runner
'169': Thangka
'170': Throwing Stick
'171': Tomb Figure
'172': Tomb Model
'173': Tongue Protector
'174': Washer
'175': Water Dropper
'176': Water Pot
'177': Wine Pot
'178': Womb Protector
'179': Woodblock Print
'180': Writing Desk
'181': accessories
'182': adzes
'183': alabastra
'184': albums
'185': altar components
'186': altars
'187': amphorae
'188': amulets
'189': anchors
'190': animation cels
'191': animation drawings
'192': anklets
'193': armbands
'194': armor
'195': armrests
'196': arrowheads
'197': arrows
'198': autograph albums
'199': axes
'200': 'axes: woodworking tools'
'201': back scratchers
'202': badges
'203': bags
'204': balances
'205': bandages
'206': bangles
'207': banners
'208': baskets
'209': beads
'210': beakers
'211': bedspreads
'212': bells
'213': belts
'214': bezels
'215': bi
'216': blades
'217': blowguns
'218': board games
'219': boats
'220': boilers
'221': bone
'222': booklets
'223': books
'224': bottles
'225': bowls
'226': boxes
'227': bracelets
'228': bread
'229': brick
'230': brooches
'231': brush washers
'232': brushes
'233': buckets
'234': buckles
'235': business cards
'236': buttons
'237': caddies
'238': calendars
'239': calligraphy
'240': candelabras
'241': candleholders
'242': candlesticks
'243': canopic jars
'244': card cases
'245': card tables
'246': cards
'247': carvings
'248': cases
'249': cash
'250': celestial globes
'251': censers
'252': chains
'253': chairs
'254': charms
'255': charts
'256': chess sets
'257': chessmen
'258': chisels
'259': chokers
'260': chopsticks
'261': cigarette cases
'262': cigarette holders
'263': cippi
'264': clamps
'265': clappers
'266': claypipe
'267': cloth
'268': clothing
'269': coats
'270': coffins
'271': coins
'272': collar
'273': combs
'274': compact discs
'275': containers
'276': coverings
'277': covers
'278': crucifixes
'279': cuffs
'280': cups
'281': cushions
'282': cutlery
'283': cylinder seals
'284': deels
'285': deity figurine
'286': diagrams
'287': dice
'288': dishes
'289': document containers
'290': documents
'291': dolls
'292': doors
'293': drawings
'294': dresses
'295': dressing gowns
'296': drums
'297': dung-chen
'298': earrings
'299': embroidery
'300': ensembles
'301': envelopes
'302': 'equipment for personal use: grooming, hygiene and health care'
'303': ewers
'304': fans
'305': fasteners
'306': 'feet: furniture components'
'307': female figurine
'308': ferrules
'309': fiddles
'310': figures
'311': figurines
'312': finials
'313': fishhooks
'314': flagons
'315': flags
'316': flasks
'317': flint
'318': fragments
'319': funnels
'320': furniture components
'321': gameboards
'322': games
'323': gaming counters
'324': ge
'325': glassware
'326': gloves
'327': goblets
'328': gongs
'329': gowns
'330': greeting cards
'331': hair ornaments
'332': hairpins
'333': hammerstones
'334': handkerchiefs
'335': handles
'336': handscrolls
'337': hanging scrolls
'338': harnesses
'339': hatpins
'340': hats
'341': headdresses
'342': headrests
'343': heads
'344': headscarves
'345': helmets
'346': hobs
'347': hoods
'348': hooks
'349': houses
'350': identity cards
'351': illuminated manuscripts
'352': incense burners
'353': incense sticks
'354': ink bottles
'355': inkstands
'356': inkstones
'357': inkwells
'358': inlays
'359': iron
'360': jackets
'361': jar seal
'362': jars
'363': jewelry
'364': jue
'365': juglets
'366': jugs
'367': kayagum
'368': keys
'369': kimonos
'370': knives
'371': kŏmun'gos
'372': ladles
'373': lamps
'374': lanterns
'375': lanyards
'376': leatherwork
'377': lids
'378': lockets
'379': loom weights
'380': maces
'381': manuscripts
'382': maps
'383': maquettes
'384': masks
'385': medals
'386': miniatures
'387': mirrors
'388': miscellaneous
'389': models
'390': money
'391': mortarboards
'392': mounts
'393': mugs
'394': mummies
'395': musical instruments
'396': nails
'397': necklaces
'398': needles
'399': netsukes
'400': nozzles
'401': obelisks
'402': obis
'403': oboes
'404': oil lamps
'405': ornaments
'406': overdresses
'407': pages
'408': paintings
'409': paper money
'410': paperweights
'411': papyrus
'412': passports
'413': pectorals
'414': pendants
'415': pennants
'416': pestles
'417': petticoats
'418': photograph albums
'419': photographs
'420': pictures
'421': pins
'422': pipes
'423': pitchers
'424': plaques
'425': plaster
'426': playing card boxes
'427': playing cards
'428': plinths
'429': plumb bobs
'430': plumbing fixtures
'431': plume holders
'432': poker
'433': pommels
'434': postage stamps
'435': postcards
'436': posters
'437': pots
'438': pottery
'439': prayer beads
'440': prayers
'441': printing blocks
'442': printing plates
'443': prints
'444': punch bowls
'445': puppets
'446': purses
'447': puzzles
'448': pyxides
'449': quilts
'450': rag-dung
'451': razors
'452': reliefs
'453': rifles
'454': rings
'455': robes
'456': roofing tile
'457': rosaries
'458': rose bowls
'459': rubbings
'460': rugs
'461': rulers
'462': sandals
'463': saris
'464': sarongs
'465': sashes
'466': sauceboats
'467': saucers
'468': saws
'469': scabbards
'470': scaraboids
'471': scarabs
'472': scarves
'473': scepters
'474': scissors
'475': scrolls
'476': sculpture
'477': seed
'478': seppa
'479': shadow puppets
'480': shawls
'481': shears
'482': shell
'483': shelves
'484': sherds
'485': shields
'486': shoes
'487': shrines
'488': sistra
'489': situlae
'490': sketches
'491': skewers
'492': skirts
'493': snuff bottles
'494': socks
'495': spatulas
'496': spearheads
'497': spears
'498': spittoons
'499': spoons
'500': stampers
'501': staples
'502': statues
'503': statuettes
'504': steelyards
'505': stelae
'506': sticks
'507': stirrup jars
'508': stools
'509': stoppers
'510': straps
'511': studs
'512': styluses
'513': sugar bowls
'514': sugar tongs
'515': swagger sticks
'516': swords
'517': tablecloths
'518': tablets
'519': tacks
'520': talismans
'521': tallies
'522': tangrams
'523': tankards
'524': tea bowls
'525': tea caddies
'526': tea kettles
'527': teacups
'528': teapots
'529': telephones
'530': ties
'531': tiles
'532': toggles
'533': toilet caskets
'534': tools
'535': toys
'536': trays
'537': trimming
'538': trophies
'539': trousers
'540': trumpets
'541': tubes
'542': tureens
'543': tweezers
'544': typewriters
'545': underdresses
'546': underwear
'547': unidentified
'548': urinals
'549': ushabti
'550': utensils
'551': vases
'552': veils
'553': vessels
'554': votive offerings
'555': waistcoats
'556': wall tile
'557': watches
'558': weighing devices
'559': weight
'560': weights
'561': whetstones
'562': whistles
'563': whorls
'564': wire
'565': wood blocks
'566': writing boards
'567': xylophones
- name: other_name
dtype: string
- name: material
dtype: string
- name: production.period
dtype: string
- name: production.place
dtype: string
splits:
- name: train
num_bytes: 2508224108.0549493
num_examples: 23325
- name: validation
num_bytes: 752601542.4750253
num_examples: 5489
- name: test
num_bytes: 745167581.1930255
num_examples: 5489
download_size: 3874244610
dataset_size: 4005993231.723
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Felladrin/ChatML-ultrachat_200k | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- text-generation
pretty_name: UltraChat 200k
---
[HuggingFaceH4/ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) in ChatML format, ready to use in [HuggingFace TRL's SFT Trainer](https://huggingface.co/docs/trl/main/en/sft_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Felladrin/Llama-160M-Chat-v1")
dataset = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")
def format(columns):
    return { "text": tokenizer.apply_chat_template(columns["messages"], tokenize=False) }
dataset.map(format).select_columns(['text', 'prompt', 'prompt_id']).to_parquet("train.parquet")
```
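For reference, the resulting `text` field follows the ChatML turn structure. A rough sketch of that layout (the exact special tokens come from the tokenizer's `chat_template`, so this is illustrative only):

```python
# Sketch of the ChatML layout produced by apply_chat_template;
# the exact tokens depend on the tokenizer's chat_template.
messages = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi there."},
]
text = "".join(
    f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
)
print(text)
```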
|
izumi-lab/sciq-ja-mbartm2m | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- ja
license:
- cc-by-nc-3.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- sciq
task_categories:
- question-answering
task_ids:
- closed-domain-qa
paperswithcode_id: sciq
pretty_name: SciQ-ja
dataset_info:
features:
- name: question
dtype: string
- name: distractor3
dtype: string
- name: distractor1
dtype: string
- name: distractor2
dtype: string
- name: correct_answer
dtype: string
- name: support
dtype: string
splits:
- name: test
num_bytes: 603074
num_examples: 1000
- name: train
num_bytes: 6996445
num_examples: 11679
- name: validation
num_bytes: 600296
num_examples: 1000
download_size: 4523396
dataset_size: 8199815
---
# Dataset Card for "sciq-ja-mbartm2m"
## Dataset Description
This is the Japanese translation of [sciq](https://huggingface.co/datasets/sciq).
The translation model used was [facebook/mbart-large-50-many-to-many-mmt](https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt).
## License
The same as the original sciq (cc-by-nc-3.0).
|
BramNH/home-assistant-nl | ---
license: mit
language:
- nl
--- |
Thanmay/carb_seq2seq | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: 'null'
- name: POS
struct:
- name: tags
sequence: string
- name: words
sequence: string
- name: SynDP
struct:
- name: tags
sequence: string
- name: words
sequence: string
- name: SemDP
struct:
- name: tags
sequence: string
- name: words
sequence: string
splits:
- name: test
num_bytes: 867024
num_examples: 641
download_size: 305282
dataset_size: 867024
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|